• 0 Posts
  • 24 Comments
Joined 8 months ago
Cake day: April 3rd, 2024



  • Flatpak has its benefits, but there are tradeoffs as well. I think it makes a lot of sense for proprietary software.

    For everything else I prefer native packages since they have fewer interop issues. Space efficiency isn’t even that important to me; if space problems do come up, they’re relatively easy to work around. But if your password manager can’t talk to your browser because the security model has no solution for safe arbitrary IPC, you’re SOL.





  • Jesus_666@lemmy.world to Linux@lemmy.ml · *Permanently Deleted* · 1 month ago

    Oh yeah, the equation completely changes for the cloud. I’m only familiar with local usage where you can’t easily scale out of your resource constraints (and into budgetary ones). It’s certainly easier to pivot to a different vendor/ecosystem locally.

    By the way, AMD does have one additional edge locally: they tend to put more VRAM into consumer GPUs at a comparable price point – for example, the 7900 XTX competes with the 4080 on price but has as much memory as a 4090. In systems with one or few GPUs (like a hobbyist mixed-use machine) those few extra gigabytes can make a real difference. Of course this leads to a trade-off between Nvidia’s superior speed and AMD’s superior capacity.


  • Jesus_666@lemmy.world to Linux@lemmy.ml · *Permanently Deleted* · 1 month ago

    These days ROCm support is more common than it was a few years ago, so you’re no longer entirely dependent on CUDA for machine learning. (Although I wish fewer tools required non-CUDA users to manually install Torch in their venv because the auto-installer assumes CUDA. At least take a parameter or something if you don’t want to implement autodetection.)
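    To be clear about the kind of autodetection I mean, here’s a minimal sketch; it isn’t taken from any actual tool, and the device check, wheel index URLs, and version suffixes are my own illustrative assumptions:

    ```python
    # Rough sketch of GPU-stack autodetection for installing Torch.
    # Not from any real installer; the device check, index URLs, and
    # version suffixes below are illustrative assumptions.
    import os
    import subprocess
    import sys


    def guess_torch_index() -> str:
        """Return a pip --index-url matching the locally available GPU stack."""
        if os.path.exists("/dev/kfd"):  # AMD compute device node -> ROCm wheels
            return "https://download.pytorch.org/whl/rocm6.2"
        try:
            # If nvidia-smi runs successfully, assume a working CUDA stack.
            subprocess.run(["nvidia-smi"], check=True, capture_output=True)
            return "https://download.pytorch.org/whl/cu124"
        except (FileNotFoundError, subprocess.CalledProcessError):
            return "https://download.pytorch.org/whl/cpu"  # CPU-only fallback


    if __name__ == "__main__":
        subprocess.check_call(
            [sys.executable, "-m", "pip", "install", "torch",
             "--index-url", guess_torch_index()]
        )
    ```

    A setup script could run something like this once inside the venv and only bother the user with a flag if the guess turns out wrong.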

    Nvidia’s Linux drivers generally are a bit behind AMD’s; e.g. driver versions before 555 tended not to play well with Wayland.

    Also, Nvidia’s drivers tend not to give any meaningful information in case of a problem. There’s typically just an error code for “the driver has crashed”, no matter what reason it crashed for.

    Personal anecdote for the last one: I had a wonky 4080, and tracing the problem to the card took months because the logs (on both Linux and Windows) contained no error information beyond “something bad happened”, and the behavior had dozens of possible causes, ranging from “the 4080 is unstable if you use XMP on some mainboards” through “some BIOS setting might need to be changed” and “sometimes the card doesn’t like a specific CPU/PSU/RAM/mainboard” to “it’s a manufacturing defect”.

    Sure, manufacturing defects can happen to anyone; I can’t fault Nvidia for that. But the combination of useless logs and 4000-series cards having so many things they can possibly (but rarely) get hung up on made error diagnosis incredibly painful. I finally just bought a 7900 XTX instead. It’s slower but I like the driver better.



  • True, although that has happened with F/OSS as well (like with xz or the couple times people put Bitcoin miners into npm packages). In either case it’s a lot less likely than the software simply ceasing to be supported, becoming gradually incompatible with newer systems, and rotting away.

    Except, of course, that I can pick up the decade-old corpse of an open source project and try to make it work on modern systems, despite how painful it is to try to get a JavaFX application written for Java 7 and an ancient version of Gradle to even compile with a recent JDK. (And then finally give up and just run the last Windows release with its bundled JRE in Wine. But in theory I could’ve made it work!)


    Note that this specifically talks about proprietary platforms. Locally-run proprietary freeware has entirely different potential issues, mostly centered around the developer no longer maintaining it. Locally-run F/OSS actually has similar issues, but they’re lessened by the fact that someone might later pick up the project and continue it.

    Admittedly, platforms are very common these days because the web is an easily accessible cross-platform GUI toolkit and SaaS is more easily monetized.


    True. Just this weekend I spent far too much time trying to get a printer to work again on Windows after its IP address changed. In the end Windows refused to talk to the printer unless I removed and then re-added the device in the Settings app, which prompted a reinstallation of the device driver. No, just changing the IP address in the device settings wasn’t enough; Windows insisted on the driver being reinstalled.

    Linux didn’t need reconfiguration; it just autodetected that the printer had moved.

    I’m not saying that Linux is without issues, not by far. But Windows has never been terribly “it just works” for me either. The closest to “it just works” was (aptly) OS X somewhere around Snow Leopard.


  • Jesus_666@lemmy.world to Games@lemmy.world · What's your favorite controller? · 3 months ago

    Honestly, it’s still the F310 for me. I’ve had mine since the early 2010s and it’s still working perfectly. Those things are built like tanks, and between XInput and DirectInput they’re compatible with just about any PC game of the last few decades, no extra software required. Also, they’re dirt cheap.

    Honorable mention to the F710, the wireless version. While Windows 10’s USB stack unfortunately broke compatibility with it (causing randomly dropped inputs), Linux does not have that problem.


  • In my experience rear-mounted sensors are the most accurate, closely followed by under-screen sensors. Side-mounted sensors are utter garbage.

    Accuracy isn’t even that much of an issue; it’s that the side-mounted ones are far too easy to trigger accidentally just by handling the phone. I can’t count the number of times my last two phones told me I’d had three incorrect fingerprint attempts right after I pulled them out of my pocket.

    Then I got a Pixel, and now I have no such issues and virtually perfect accuracy. Same on a Samsung tablet. Same on an old phone of mine that had the power button on the rear along with a full-size sensor.

    Basically, I’m perfectly happy with any front- or rear-mounted full-size sensor. Those tiny side-mounted ones suck.



  • NTFS feels rock solid if you use only Windows and extremely janky if you dual-boot. Linux currently can’t really fix NTFS volumes and thus won’t mount them if they’re inconsistent.

    As it happens, they’re inconsistent all the time. I’ve had an NTFS volume become dirty after booting into Windows and then shutting down. Not a problem for Windows but Linux wouldn’t touch the volume until I’d booted into Windows at least once.

    I finally used a storage upgrade as an opportunity to move most drives to Btrfs, save for the Windows system volume and a shared data partition that’s now on exFAT, which is good enough for that purpose.





  • Jesus_666@lemmy.world to Linux@lemmy.ml · KDE Plasma needs stability · 7 months ago

    Mind you, the real winner is of course Android. It has a consistent, easy to learn interface and a wide range of applications that integrate nicely.

    And we don’t need to speculate; it has already won and is the true face of Linux for the masses. Plenty of young people don’t even own traditional computers anymore and do everything on their smartphones or tablets.

    And that’s why this entire discussion is really just a form of fan wank; we don’t need to find a unified UI for Linux because it has already been found and has a massive market share. You may not like it, but this is what peak performance looks like.

    Everything else can be as complicated, janky, or exotic as it wants because it doesn’t matter.