
I agree with you. Also, there are people (like me) who prefer small commits (that don't break stuff) over huge mega-commits. If I do make small broken/WIP commits, they only live on my working branch, and I do an interactive rebase to squash them into good, cohesive commits. A rough sketch of that workflow is below.
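For example, something like this (branch and commit names are just placeholders):

    git checkout -b wip/parser          # private working branch
    git commit -m "wip: parse headers"
    git commit -m "wip: fix off-by-one"
    git rebase -i main                  # squash/reword the wip commits into cohesive ones
    git checkout main
    git merge wip/parser                # only clean, non-breaking commits land here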

The past was bad. But the present is far worse. Tell that to the people who have been disappeared into the ICE concentration camps. Or to any trans person in any bad state.

Windows 3.1, with the appropriate drivers and a modern SVGA card, had accelerated 2D graphics. Accelerated GUIs don't even need a GPU or 3D.

What does "GPU" mean here? Previous uses of the term seemed to imply "dedicated hardware for improving rendering performance" which the SVGA stuff would seem to fall squarely under.

The term GPU was first coined by Sony for the PlayStation with its 3D capabilities, and has been associated with 3D rendering since. In some products it stood for Geometry Processing Unit, again referring to 3D. Purely 2D graphics coprocessors generally don’t fall under what is considered a GPU.

It has been associated with 3D rendering, but given that things like the S3 86C911 are listed on the Wikipedia GPU page, saying "Accelerated GUIs don't need GPU" feels like attempting to win an argument by insisting on a term definition that diverges significantly from standard vulgar usage [1], which doesn't provide any insight into the problem originally being discussed.

[1] Maybe I've just been blindly ignorant for 30 years, but as far as I could tell, 'GPU' seemed to emerge as a more Huffman-efficient encoding for the same thing we were calling a 'video card'


I don’t agree with what you state as the vulgar usage. “Graphics card” was the standard term for a long time, even after they generally carried a (3D) GPU. Maybe up until around 2010 or so? There was no time when 2D-only graphics cards were called GPUs, and you didn’t consciously buy a discrete GPU if you weren’t interested in (3D) games or similar applications.

In the context of the discussion, the point is that you don’t need high-powered graphics hardware to achieve a fast GUI for most types of applications that WPF would be used for. WPF being slow was due to architectural or implementation choices.


That's the real takeaway - WPF should have degraded gracefully (read, full speed performance without the bling) but it didn't.

We used to call those things a "Video Card", which you put into your computer to get a video signal out.

Back in the day there was a card called an S3 ViRGE, which we affectionately called a 3D decelerator card, because of its lacklustre 3D performance.


Most people consider GPU to mean "3D accelerator" though technically it refers to any coprocessor that can do work "for" the main system at the same time.

GPU-accelerated GUI usually refers to using the texture mapping capabilities of a 3D accelerator for "2D" GUI work.
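To make that concrete, here is a minimal sketch in legacy OpenGL, purely for illustration: a "2D" GUI element ends up as a textured quad drawn in an orthographic projection. The texture id, window size, and coordinates are all placeholders.

    #include <GL/gl.h>

    // Draw one "2D" GUI element as a textured quad. Assumes a current GL
    // context and an already-uploaded texture (tex); both are placeholders.
    void draw_gui_quad(GLuint tex) {
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        glOrtho(0, 800, 600, 0, -1, 1);   // pixel coordinates, y-down
        glEnable(GL_TEXTURE_2D);
        glBindTexture(GL_TEXTURE_2D, tex);
        glBegin(GL_QUADS);                // one GUI element, one textured quad
        glTexCoord2f(0, 0); glVertex2f(100, 100);
        glTexCoord2f(1, 0); glVertex2f(300, 100);
        glTexCoord2f(1, 1); glVertex2f(300, 150);
        glTexCoord2f(0, 1); glVertex2f(100, 150);
        glEnd();
    }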


I remember these disks from my Spectrum +3. They were indeed harder and more resistant than the 3.5" ones. It's sad that the format was on the losing side and never evolved beyond the 128K (or was it 256K?) it could store on a single side.


I strongly suggest any distro other than Ubuntu. Canonical is a Microsoft wannabe.

Fighting off snaps would be reason enough to abandon them, but Canonical controls the snap store in a way that is antithetical to open source; they're trying to run a walled-garden play. This is exactly the type of crap that lit a fire under my ass to get off Windows in the first place.

There is a weird bug in Helldivers 2 when you use maximized borderless window mode with a multi-monitor setup. Sometimes the character stops rotating with mouse look... as if the mouse cursor had hit a limit. Another one is with full-screen mode: when I switch windows with Alt-Tab, sometimes the game restores with the wrong display proportions.

As far as I know, these bugs happen on Windows too.


I don't pay anything to YouTube and I don't see any ads, because I block ads.

Do you feel good about YouTube spending money on hosting and video producers spending time/money on content that you're paying nothing for? How is that sustainable?

The categorical imperative has been put on life support since 2016 at the latest.

Everything is smash and dash now. And nobody with the means to change it cares about externalities anymore.


I don't understand any of those sentences. How do they apply to YouTube?

Frankly? That's Google's (well, Alphabet's, I guess) problem.

They're a multibillion-dollar international monopoly with absolutely staggering amounts of money and power, actively engaging in a wide variety of activities directly aimed at making the lives of every normal person on the planet worse so that they can have more power, more control, and more money. Me blocking ads on YouTube not only costs them effectively nothing, it's also the act of a flea against a polar bear.

If Alphabet showed any signs of actually wanting to create a sustainable alternative to the surveillance economy, I might have some sympathy for them. But not only do they not do this, they are the ones who created it in the first place.


Doesn't your boycott of the ad-free model confirm to them that the only viable business model is ads?

I'm not sure where you got the idea that I'm boycotting "the ad-free model".

I'm boycotting them. After all, every cent that goes their way supports surveillance advertising (among other unsavory things).

I have other subscriptions that support ad-free creators.

If they choose to misconstrue my refusal to support them with either money or ad views, that's also their problem. (Also, that's patently never going to happen, because my signal vanishes instantly into the noise.)


My first computer's whole RAM could fit in the L1 cache of a single core (128K).

They added 100 to the 486, but they got 585.99999999..., so they called it Pentium xD

