- > it still ran fine (20+ fps)
20 fps is not fine. I would consider that unplayable.
I expect at least 60, ideally 120 or more, as that's where the diminishing returns really start to kick in.
I could tolerate as low as 30 fps on a game that did not require precise aiming or reaction times, which basically eliminates all shooters.
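For concreteness, here's the frame-time arithmetic behind the diminishing returns (my own quick sketch, nothing more): each jump in fps shaves off a smaller and smaller absolute slice of latency.

```typescript
// Frame time in milliseconds for a given frame rate.
const frameTimeMs = (fps: number): number => 1000 / fps;

for (const fps of [20, 30, 60, 120, 240]) {
  console.log(`${fps} fps -> ${frameTimeMs(fps).toFixed(1)} ms per frame`);
}
// 20 fps -> 50.0 ms, 30 -> 33.3 ms, 60 -> 16.7 ms, 120 -> 8.3 ms, 240 -> 4.2 ms
// Going from 30 to 60 fps saves ~16.7 ms per frame; going from 120 to 240 saves only ~4.2 ms.
```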
- > 1440p and 2160p is a total waste of pixels, when 1080p is already at the level of human visual acuity.
Wow, what a load of bullshit. I bet you also think the human eye can't see more than 30 fps?
If you're sitting 15+ feet away from your screen, yeah, you can't tell the difference. But most people sit only 2-3 feet from their monitor, and at that distance the difference is absolutely noticeable.
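To put rough numbers on that (my own back-of-the-envelope sketch, assuming a 27-inch 16:9 panel and the common ~60 pixels-per-degree rule of thumb for 20/20 acuity):

```typescript
// Approximate pixels-per-degree for a monitor at a given viewing distance.
// 60 PPD is the usual rule of thumb for the limit of 20/20 visual acuity.
function pixelsPerDegree(horizontalPixels: number, diagonalInches: number, distanceInches: number): number {
  const widthInches = diagonalInches * 16 / Math.hypot(16, 9); // width of a 16:9 panel
  const ppi = horizontalPixels / widthInches;                  // pixels per inch
  return ppi * distanceInches * Math.tan(Math.PI / 180);       // pixels spanned by one degree of view
}

const distance = 30; // inches, ~2.5 feet
console.log(pixelsPerDegree(1920, 27, distance).toFixed(0)); // ~43 PPD -- well below 60
console.log(pixelsPerDegree(2560, 27, distance).toFixed(0)); // ~57 PPD
console.log(pixelsPerDegree(3840, 27, distance).toFixed(0)); // ~86 PPD
```

So at typical desk distances, 1080p is nowhere near the acuity limit; 1440p roughly reaches it and 4K exceeds it.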
> HDR and more ray tracing/path tracing, etc. are more sensible ways of pushing quality higher.
HDR is an absolute game-changer, for sure. Ray-tracing is as well, especially once you learn to notice the artifacts created by shortcuts required to get reflections in raster-based rendering. It's like bad kerning. Something you never noticed before will suddenly stick out like a sore thumb and will bother the hell out of you.
- I don't expect the manufacturers to change.
I do expect people to change though.
How is it that it's been well known that smart TVs will show ads and spy on you for over 10 years, and yet people are still connecting their TVs to their WiFi rather than get a separate dedicated streaming device?
I just don't get it. How are people still surprised to find their TV is spying and will show ads?
- I don't use the Smart features and instead use a $30 Amazon Fire TV stick (for streaming services) and a Raspberry Pi (for torrents).
This has the major advantage that if the streaming hardware is ever obsoleted for any reason (e.g., Netflix decides my TV is too old to support a compression codec they want to switch to), I only have to buy a new $30 media player and not a whole new TV.
- I don't even know the last time I pirated music. Gotta be at least 10 years.
Meanwhile, I pirate movies/TV on a regular basis for the reasons you gave. At one point, I was subbed to 5 services, and decided enough was enough. Cancelled all but Netflix and went back to torrenting anything they didn't have.
- Why would they look silly?
We don't refer to the year 700 as 0700. It'll be perfectly natural in 8,000 years to see "10025" for the year.
- If NVIDIA and AMD aren't making GPUs, who's going to make the GPU for the Steam Machine?
- > Are there any popular games with a system requirement greater than a 2060?
Requirement, probably not. You can run Cyberpunk on minimal settings on a 2060. But at maximum settings in 4K with HDR, even an RTX 5090 requires DLSS frame generation. But it looks stunning.
I think PC gamers have a higher standard when it comes to graphical quality and performance. Many console gamers have been convinced that 30 fps is fine. Meanwhile, PC gamers are likely on 144 Hz or higher monitors and expect 144+ fps.
- In the mid 90s, sure, but prices were coming down fast.
There was a sweet spot in ~1999 where you could buy an ABIT BH6 motherboard, a Celeron 300A (overclocked from 300 MHz to 450 MHz), a 3dfx Banshee, and all the other components to make a PC for only ~$500 and have a respectable gaming machine.
- > there is still a market there, someone will fill it.
How? The cost to design a GPU from scratch is astronomical. Even if a group of senior engineers left NVIDIA to start their own company, I'd be surprised if applying as much of their knowledge as possible didn't land them in patent violations.
Even ignoring the design side, you have to deal with actual manufacturing. TSMC is essentially the only game in town that could make it, but their fabs are already occupied by NVIDIA's orders. Building your own fab costs billions of dollars and takes several years.
- A friend of mine posed a question...
What happens to PC gaming if, a couple of years from now, NVIDIA decides the opportunity cost of manufacturing gaming GPUs instead of data center GPUs has become so high that it steps out of the gaming hardware business altogether?
NVIDIA and AMD both use TSMC to manufacture their GPUs, so if NVIDIA made that decision, presumably AMD would as well. That basically leaves Intel, but their GPUs are lackluster and the drivers are awful.
Gaming PCs have become ludicrously expensive. Even mid-grade builds are $2,000. You can't really build a decent gaming PC for under $1,500 if you want to play the latest AAA games.
- I hate the term "leak". It used to have meaning.
Now, it's either a fancy term for "announcement", or people use it synonymously with "rumor".
- 100% this
The PCI Express bus is actually rather slow: only ~63 GB/s, even at PCIe 5.0 x16!
PCIe is simply not a bottleneck for gaming. All the textures and models are loaded into the GPU once, when the game loads, then re-used from VRAM for every frame. Otherwise, if assets had to be streamed across the bus every frame, a scene with a lowly 2 GB of assets would cap out at only ~30 fps.
Which is funny to think about historically. I remember when AGP first came out, it was advertised as making it so GPUs wouldn't need tons of memory, only enough for the frame buffers, because they would stream texture data across AGP. Well, bandwidth never kept up with demand. And now, even if the port itself were fast enough, the system RAM wouldn't be: DDR5-6400 running in dual-channel mode is only ~102 GB/s. On the flip side, the RTX 5050, a current-gen budget card, has over 3x that at 320 GB/s, and at the top end, the RTX 5090 is at 1.8 TB/s.
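Rough arithmetic behind the "2 GB of assets would cap out at ~30 fps" point, using the headline bandwidth numbers above (my own sketch):

```typescript
// If a frame's assets had to be streamed over a bus every frame instead of sitting in VRAM,
// the frame rate would be capped at bandwidth / bytes-per-frame.
const fpsCap = (bandwidthGBps: number, assetsGB: number): number => bandwidthGBps / assetsGB;

const assetsGB = 2; // GB of textures/models touched per frame
console.log(fpsCap(63, assetsGB).toFixed(1));   // PCIe 5.0 x16:           ~31.5 fps
console.log(fpsCap(102, assetsGB).toFixed(1));  // dual-channel DDR5-6400: ~51 fps
console.log(fpsCap(320, assetsGB).toFixed(1));  // RTX 5050 VRAM:          ~160 fps
console.log(fpsCap(1800, assetsGB).toFixed(1)); // RTX 5090 VRAM:          ~900 fps
```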
- Depends wildly on what you're doing.
I'm a gamer, often playing games that need a BEEFY CPU, like MS Flight Simulator. My upgrade from an i9-9900K to a Ryzen 9800X3D was noticeable.
- The attacker can do anything using your session.
The "Hello world" examples always show using it to steal your cookies, which obviously doesn't work now when nearly every site uses the "httpOnly" flag which makes the cookie inaccessible to JavaScript, but really, stealing your session isn't necessary. They just have to make the XSS payload run the necessary JavaScript.
Once the JavaScript is running on the page, all bets are off. They can do ANYTHING that the page can do, because now they can make HTTP requests on your behalf. SOP no longer applies. CSRF no longer protects you. The attacker has full control of your account, and all the requests will appear to come from YOUR browser.
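As a hedged illustration of what that looks like (the endpoint, field names, and CSRF meta tag are all made up for the example, not any particular site's API), an injected payload never needs to see the cookie at all:

```typescript
// Hypothetical XSS payload. Because this script runs on the victim page's own origin,
// fetch() is a same-origin request, the browser attaches the (HttpOnly) session cookie
// automatically, and the CSRF token can be read out of the page exactly like the
// legitimate frontend would read it.
(async () => {
  const csrfToken =
    document.querySelector<HTMLMetaElement>('meta[name="csrf-token"]')?.content ?? "";

  // "/api/account/email" and the body shape are illustrative assumptions.
  await fetch("/api/account/email", {
    method: "POST",
    credentials: "include", // send cookies, just as the real app does
    headers: { "Content-Type": "application/json", "X-CSRF-Token": csrfToken },
    body: JSON.stringify({ email: "attacker@example.com" }),
  });
})();
```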
- Let me put it this way...
In one of my penetration testing training classes, in one of the lessons, we generated a malicious PDF file that would give us a shell when the victim opened it in Adobe Reader.
Granted, it relied on a specific bug in Adobe Reader's JavaScript engine, so unless the victim is running a version that's 15 years old, it wouldn't work today. But you can never be too cautious; 0-days always exist.
- XXE should have never existed.
Whoever decided it should be enabled by default should be put into some sort of cybersecurity jail.
- Another option:
Deliberate heat generation.
If it's cold and you're going to be running a heater anyway, and your heat is resistive, then running a cryptominer is just as efficient at making heat and returns a couple of dollars back to you. It effectively becomes "free" relative to running the heater.
If you use a heat pump, or you rely on burning something (natural gas, wood, whatever) to generate heat, then the math changes.
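A rough comparison of why the heat source matters (electricity price, miner revenue, and heat-pump COP below are all illustrative assumptions, not real figures):

```typescript
// Net cost of one kWh of delivered heat under each option. All numbers are assumptions.
const electricityPrice = 0.15; // $ per kWh of electricity
const minerRevenue = 0.05;     // $ earned per kWh the miner consumes (assumed)
const heatPumpCOP = 3;         // kWh of heat delivered per kWh of electricity (assumed)

const resistiveHeat = electricityPrice;               // 1 kWh in -> 1 kWh of heat: $0.15
const minerHeat = electricityPrice - minerRevenue;    // same heat, minus payback:   $0.10
const heatPumpHeat = electricityPrice / heatPumpCOP;  // 1 kWh in -> 3 kWh of heat:  $0.05

console.log({ resistiveHeat, minerHeat, heatPumpHeat });
// Versus resistive heat the miner is a strict win, but a heat pump still delivers
// the same warmth for less than the miner nets you.
```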
- I should check my SSH logs.
My intuition is that since the SSH server reports what auth methods are available, once a bot sees that password auth is disabled, they will disconnect and not try again.
But I also know that bots can be dumb.
Old enough to learn that it's a sociopathic stance that has no business in a well-functioning society.
You're arguing in favor of what's essentially a scam.