One foot planted in “Yeehaw!” the other in “yuppie”.

  • 3 Posts
  • 24 Comments
Joined 2 years ago
Cake day: June 11th, 2023

  • Pretty sure it’s against the TOS to do that. So if found, the account is simply terminated and it ceases being valuable. That means that even if it’s sold, its value isn’t in the games but in your friend network - as a sort of trojan spam/burner account. Which also means that it’s not worth more than a few dollars at MOST unless you’re some big-time Twitch streamer with a vast network of Steam friends.

    So yeah, just be aware of what you’re getting into. It’s not likely some guy who wants an instant steam library - it’s someone who wants to exploit your friends, family, and acquaintances for money via scams. Don’t be that guy.



  • I think this take is starting to be a bit outdated. There have been numerous films to use Blender. The “biggest” recent one is RRR - https://www.blender.org/user-stories/visual-effects-for-the-indian-blockbuster-rrr/

    Man in the High Castle is also another notable “professional” example - https://www.blender.org/user-stories/visual-effects-for-the-man-in-the-high-castle/

    It’s been slow, but Blender is starting to break into the larger industry, with bigger productions tending to come from non-U.S. producers.

    There is something to be said about the tooling exclusivity in U.S. studios and backroom deals. But ultimately money talks and Autodesk only has so much money to secure those rights and studios only have so much money to spend on licensing.

    I’ve been following Blender since 2008 - what we have now is unimaginable in comparison to then. Real commercial viability has been reached (as a tool). What stands in the way now is a combination of entrenched interests and money. Intel shows how that’s a tenuous market position at best, and actively self-destructive at worst.

    Ultimately I think your claim that it’s not used by real studios is patently and provably false. But I will concede that it’s still an uphill battle and moneyed interests are almost impossible to defeat. They typically need to defeat themselves first, sorta like Intel did.









  • On a technical level, your own user count matters less than the user and comment counts of the instances you subscribe to. Too many subscriptions can overwhelm smaller instances and saturate your network in terms of packets per second and your ISP’s routing capacity - not to mention your router. Additionally, most ISPs block traffic going to your house on port 80 - so you’d likely need to put it behind a Cloudflare tunnel for anything resembling reliability. Your ISP may be different, and it’s always worth asking what restrictions they have on self-hosted services (non-business use cases specifically). Otherwise, going with your ISP’s business plan is likely a must. Outside of that, yes, you’ll need a beefy router or switch (or multiple) to handle the constant packets coming into your network.

    Then there’s a security aspect. What happens if your site is breached in a way that an attacker gains remote execution? Did you make sure to isolate this network from the rest of your devices? If not, you’re in for a world of hurt.
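    One way to get that isolation is at the firewall: put the server on its own subnet or VLAN and drop forwarded traffic from it to the rest of the LAN. A rough sketch in nftables syntax - the subnets here are made-up placeholders, not a tested ruleset:

    ```
    # /etc/nftables.conf fragment - illustrative only; subnets are placeholders
    table inet filter {
      chain forward {
        type filter hook forward priority 0; policy accept;
        # the server DMZ (10.0.50.0/24) may reach the internet,
        # but not the home LAN (192.168.1.0/24)
        ip saddr 10.0.50.0/24 ip daddr 192.168.1.0/24 drop
      }
    }
    ```

    The same idea works with VLAN-capable switches and inter-VLAN ACLs; the point is that a compromised instance shouldn’t be able to reach your personal devices.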

    These are all issues that are mitigated and easier to navigate on a VPS or cloud provider.
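    On the port 80 problem specifically: a Cloudflare Tunnel sidesteps ISP blocking because cloudflared makes an outbound connection, so nothing inbound needs to be open. A minimal sketch of a cloudflared config - the hostname and tunnel ID are placeholders, and it assumes your Lemmy frontend or reverse proxy listens on localhost:80:

    ```yaml
    # /etc/cloudflared/config.yml - sketch, not a tested deployment
    tunnel: <your-tunnel-id>                  # created with `cloudflared tunnel create`
    credentials-file: /etc/cloudflared/<your-tunnel-id>.json

    ingress:
      - hostname: lemmy.example.com           # placeholder domain
        service: http://localhost:80          # wherever your Lemmy proxy listens
      - service: http_status:404              # catch-all rule cloudflared requires
    ```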

    As for the non-technical issues:

    There’s also the problem of moderation. What I mean is that, as a server owner, you WILL end up needing to quarantine, report, and submit illegal images to the authorities - even if you use a whitelist of only the most respectable instances. It might not happen soon, but it’s only a matter of time before your instance happens to be subscribed to a popular external community when it gets hit with a nasty attack, leaving you to deal with a stressful cleanup.

    When you run this on a homelab on consumer hardware, it’s easier for certain government entities to claim that you were not performing your due diligence and may even be complicit in the content’s proliferation. Now, of course, proving such a thing is always the crux, but in my view I’d rather have my site running on things that look as official as possible. The closer it resembles what an actual business might do, the better I think I’d fare under a more targeted attack - from a legal/compliance standpoint.



  • See: every big AAA game release lately. Even on Windows, having to nuke your graphics drivers and install a specific version from some random forum is generally accepted as fine, like it’s just how PC gaming is.

    Never had to do that since I was ROM hacking an old RX480 for Monero hashrates. In fact, on my Windows 11 partition (Used for HDR gaming which isn’t supported on Linux yet), I haven’t needed to perform a reinstall of the NVIDIA driver even when converting from a QEMU image to a full-fat install.

    When I see those threads, it often comes across as a bunch of gamers just guessing at a potential solution and becoming “right” for the “wrong” reasons. Especially when the result is some convoluted combination of installs and uninstalls with “wiping directories and registry keys”.

    But, point taken, the lengths gamers will go to to get an extra 1-2 FPS even if it’s unproven, dangerous, and dumb is almost legendary.




  • I really doubt that. Again - advanced user here - with numerous comparison points to other Arch-based distros. I also maintain large distributed DB clusters for Fortune 100 companies.

    If it was something not on the latest version - it’s not due to my lack of effort or knowledge, but instead due to the terrible way Garuda is managed.

    What, am I supposed to compile kernel modules from scratch myself? Never needed to do that with Endeavour, Manjaro, or just Arch.

    If Garuda’s install (and subsequent upgrade) doesn’t fetch the latest from the Arch repos, that’s on them.

    EDIT: Also, these non-answers are tiresome, low effort, and provide zero guidance on any matter. I know every single kernel change since 5.0 that impacted my hardware. I have RSS feeds for each of the hardware components I have, and if Linux or a distro ships an enhancement for my hardware, I’m usually aware well before it is released. If you were to point to any bit of my hardware I can tell you, for certain, what functionalities are supported, which have bugs, and common workarounds.

    If you want this type of feedback to be valuable, then let me know if a new issue/regression has arisen given the list of hardware I’ve supplied.

    Valuable: “Perhaps it was the latest kernel X which shipped some regressions for Nvidia drivers that causes compositor hitching on KWin”

    Utterly Useless: “It’s very likely some drivers are not up to date or compatible with your system.”





  • I don’t get it either. My brother-in-law is like this. And he refused to take his kids to see Buzz Lightyear because of its “political” nature. I was dumbfounded when I heard that. To think that representation is just some nebulous political aim.

    At this rate, we should just consider any media with a kiss in it “political media.”

    And I even grew up with this dude in the early 2000s. He didn’t seem like this before.

    I try to forget about the guy, but it’s kind of hard because he won’t let me see the nieces because I’m too “liberal”.


  • I agree. I think 1440p+HDR is probably the way to go for now. HDR is FAR more impactful than a 4K resolution and 1440p should provide a stable 45ish FPS on Cyberpunk 2077 completely maxed out on an RTX 3080Ti (DLSS Performance).

    And in terms of CPU, the same applies. 16 cores are for the Gentoo-using, source-compiling folks like me. 8 cores on a well-binned CPU from the last 3 generations goes plenty fast for gaming. CPU bottlenecks only really show up at 144fps+ in most games anyways.