• 0 Posts
  • 138 Comments
Joined 11 months ago
Cake day: March 22, 2024

  • nyan@sh.itjust.works to Linux@lemmy.ml · AMD vs Nvidia
    4 days ago

    I wouldn’t say the proprietary nvidia drivers are any worse than the open-source AMD drivers in terms of stability and performance (nouveau is far inferior to either). Their main issue is that they tend to be desupported long before the hardware breaks, leaving you with the choice of either nouveau or keeping an old kernel (and X version if using X—not sure how things work with Wayland) for compatibility with the old proprietary drivers.


  • nyan@sh.itjust.works to Linux@lemmy.ml · AMD vs Nvidia
    4 days ago

    If those are your criteria, I would go with AMD right now, because only the proprietary driver will get decent performance out of most nVidia cards. Nouveau is reverse-engineered and can’t tap into a lot of features, especially on newer cards, and while I seem to recall there is a new open-source driver in the works, there’s no way it’s mature enough to be an option for anyone but testers.



  • On Linux, the OOM reaper should come for the memory cannibal eventually, but it can take quite a while. Certainly it’s unlikely to be quick enough to avoid the desktop going unresponsive for a while. And it may take out a couple of other processes first, since it takes out the process holding the most memory rather than the one that’s trying to allocate, if I recall correctly.
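
    For the curious, here’s a minimal Python sketch (assuming a Linux system with /proc mounted) that lists which processes the kernel currently considers the best OOM victims, by reading each one’s oom_score. The exact scoring heuristics vary by kernel version, so treat this as illustrative only.

    ```python
    #!/usr/bin/env python3
    # Sketch only: rank processes by the kernel's current OOM "badness" score.
    # Assumes Linux with /proc mounted; scoring details vary by kernel version.
    import os

    candidates = []
    for pid in filter(str.isdigit, os.listdir("/proc")):
        try:
            with open(f"/proc/{pid}/oom_score") as f:
                score = int(f.read())
            with open(f"/proc/{pid}/comm") as f:
                name = f.read().strip()
        except OSError:
            continue  # process exited (or is unreadable) between listing and reading

        candidates.append((score, int(pid), name))

    # A higher oom_score means a more attractive victim, which in practice is
    # usually whoever holds the most memory, unless oom_score_adj was tweaked.
    for score, pid, name in sorted(candidates, reverse=True)[:10]:
        print(f"{score:6d}  {pid:7d}  {name}")
    ```

    If there’s a process you really don’t want reaped, writing a negative value into its /proc/<pid>/oom_score_adj (down to -1000) makes it a much less attractive target.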


  • Test the network from the lowest level if you haven’t already, using ping and the IPv4 address of a common server (for instance, ping 8.8.8.8) to bypass DNS.

    If it works, your DNS is borked.

    If it doesn’t, then there’s something more fundamentally wrong with your network configuration—I’d guess an issue with the gateway IP address, which would mean the machine can’t figure out how to reach the wider Internet, although it seems super-weird for that to happen with DHCP in the mix. Maybe you left some vestiges of your old configuration behind in a file that your admin GUI doesn’t clean up, and it’s overriding DHCP; I don’t know.
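
    If you’d rather script the check, here’s a rough Python equivalent of the same two-step test. It substitutes a TCP connection to port 53 on 8.8.8.8 for a real ICMP ping (raw sockets need root), and example.com is just a placeholder hostname.

    ```python
    #!/usr/bin/env python3
    # Sketch of the two-step check described above: raw-IP reachability first,
    # then DNS. The IP, port, and hostname below are illustrative choices.
    import socket

    def can_reach_ip(ip="8.8.8.8", port=53, timeout=3):
        """TCP-connect to a literal IP: exercises routing/gateway, not DNS."""
        try:
            with socket.create_connection((ip, port), timeout=timeout):
                return True
        except OSError:
            return False

    def can_resolve(name="example.com"):
        """Check whether name resolution itself works."""
        try:
            socket.gethostbyname(name)
            return True
        except socket.gaierror:
            return False

    if not can_reach_ip():
        print("Can't reach a raw IP at all: look at the gateway/routing side.")
    elif not can_resolve():
        print("Raw-IP connectivity is fine but lookups fail: DNS is borked.")
    else:
        print("Both routing and DNS look fine from here.")
    ```

    Plain ping 8.8.8.8 from a shell is still the simpler test; the script just saves you interpreting the output by hand.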



  • The performance boost provided by compiling for your specific CPU is real but not terribly large (<10% in the last tests I saw some years ago). Probably not noticeable on common arches unless you’re running CPU-intensive software frequently.

    Feature selection has some knock-on effects. Tailored features mean that you don’t have to install masses of libraries for features you don’t want, each of which comes with its own bugs and security issues. The number of vulnerabilities added and the amount of disk space chewed up usually isn’t large for any one library, but once you’re talking about a hundred or more, it does add up.

    Occasionally, feature selection prevents mutually contradictory features from fighting each other—for instance, a custom kernel that doesn’t include the nouveau drivers isn’t going to have nouveau fighting the proprietary nvidia drivers for command of the system’s video card, as happened to an acquaintance of mine who was running Ubuntu (I ended up coaching her through blacklisting nouveau). These cases are very rare, however.

    Disabling features may allow software to run on rare or very new architectures where some libraries aren’t available, or aren’t available yet. This is more interesting for up-and-coming arches like riscv than dying ones like mips, but it holds for both.

    One specific pro-compile case I can think of that involves neither features nor optimization is that of aseprite, a pixel graphics program. The last time I checked, it had a rather strange licensing setup that made compiling it yourself the best choice for obtaining it legally.

    (Gentoo user, so I build everything myself. Except rust. That one isn’t worth the effort.)




  • Thing is, even when Ubuntu’s software has been packaged outside Ubuntu, it’s so far failed to gain traction. Upstart and Unity were available from a Gentoo overlay at one point, but never achieved enough popularity for anyone to try to move them to the main tree. I seem to recall that Unity required a cartload of core system patches that were never upstreamed by Ubuntu to be able to work, which may have been a contributing factor. It’s possible that Ubuntu doesn’t want its homegrown software ported, which would make its contribution to diversity less than useful.

    > I’d add irrational hate against Canonical to the list of possible causes.

    Canonical’s done a few things that make it quite rational to hate them, though. I seem to remember an attempt to shoehorn advertising into Ubuntu, à la Microsoft—it was a while ago and they walked it back quickly, but it didn’t make them popular.

    (Also, I’m aware of the history of systemd, and Poettering is partly responsible for the hatred still focused on the software in some quarters. I won’t speak to his ability as a programmer or the quality of the resulting software, but he is terrible at communication.)

    > And you have fixed versions every half a year with a set of packages that is guaranteed to work together. On top of that, there’s an upgrade path to the next version - no reinstall needed.

    I’ve been upgrading one of my Gentoo systems continuously since 2008 with no reinstalls required—that’s the beauty of a rolling-release distro. And I’ve never had problems with packages not working together when installing normally from the main repository (shooting myself in the foot in creative ways while rolling my own packages or upgrades doesn’t count). Basic consistency of installed software should be a minimum requirement for any distro. I’m always amazed when some mainstream distro seems unable to handle dependencies in a sensible manner.

    I have nothing against Ubuntu—just not my cup of tea for my own use—and I don’t think it’s a bad distro to recommend to newcomers (I certainly wouldn’t recommend Gentoo!). That doesn’t mean it’s the best, or problem-free, or that its homegrown software is necessarily useful.


  • On the one hand, diversity is usually a good thing for its own sake, because it reduces the number of single points of failure in the system.

    On the gripping hand, none of Ubuntu’s many projects has ever become a long-term, distro-agnostic alternative to whatever it was supposed to replace, suggesting either low quality or insufficient effort.

    I’m . . . kind of torn. Not that I’m ever likely to switch from Gentoo to Ubuntu, so I guess it’s a moot point.






  • A whole bunch of non-user-facing projects providing vital libraries that are largely ignored until something blows up in people’s faces, as happened with openssl some years ago. Some of them contain quite a bit of code (for example, ffmpeg, which underpins a lot of open-source media playback software). Among browsers specifically, Pale Moon has been around for years, is maintaining a lot of code no longer carried by Firefox along with a fair amount of original code, and has no cash source beyond user donations, which might stretch to paying for the servers in a good month.

    The projects with corporate sponsorship, or even a steady flow of large donations, are in the minority. There’s a reason the xkcd about the “project some random person in Nebraska has been thanklessly maintaining since 2003” exists.


  • Money isn’t important. Some complex software is, in fact, maintained by unpaid volunteers who feel strongly about the project. That doesn’t mean it’s easy (in fact it’s quite difficult to keep the lights on and the code up-to-date), but it is A Thing That Happens despite being difficult.

    What is important is the size of the codebase (in the case of a fork, that’s the code either written for the fork or code that the fork preserves and maintains that isn’t in the original anymore), the length of time it’s been actively worked on, and the bus factor. Some would-be browser forks are indeed trivial and ephemeral one-man shows. Others have years of active commit history, carry tens or even hundreds of thousands of lines of novel or preserved code, and have many people working on them.



  • If you’re interested in doing the tech equivalent of a party trick (except that it’s less interesting to watch), go ahead and try. You’ll probably just end up reinstalling almost every package on the system that differs between the base distro and the offshoot. Harmless, but also pointless, since you could just have installed Debian from the get-go and saved yourself a lot of trouble.

    There are a whole bunch of Very Silly Things you can do in the Linux world that aren’t worth the effort unless your income relies on the creation of niche Youtube vids. For instance, it should theoretically be possible to convert a system from Debian to Gentoo without wiping and reinstalling. I’m not going to try it.