
  • Lan-mouse looks great but keep in mind that there’s no network encryption right now. There is a GitHub ticket open and the developer seems eager to add encryption. It’s just worth understanding that all your keystrokes are going across the network unencrypted.

  • Things I will bet money on

    • They will produce no evidence of any wrongdoing uncovered from any of these raids
    • They will give some cryptic statement that tries to make it sound like they did find something
    • Texas lawmakers will continue to not hold Paxton accountable for anything
  • Shoot your shot, player.

    Don’t go crazy or over the top, just say it. If they’re a good friend they won’t be scared away. If they’re like you that way, you’ll both be happier.

    Don’t overthink it, ask them if they’d ever like to hang out or do something more like a date.

    Ballsy, direct, badass. That can be you.

    Dating is awkward but life gets a lot better once you get more comfortable with it. Everyone is a dating idiot until they’re not, and there’s a good chance your friend is still in the idiot stage and maybe he’ll be over the moon that you helped him push through it.

  • Rather than distro hopping, maybe try the Zen kernel, compile the kernel yourself with a tweaked config and scheduler, or run a newer version of the stock kernel?

    I’m not super current on what’s in each kernel, but I’d expect the latest mainline to handle newer processors better than some of the older stable kernels in the more mainstream, slower-releasing distros.

  • Ran Asahi for several months, tried it out again recently. It’s good/fine, I just don’t love Fedora.

    There’s some funkiness with the more complicated install, AI acceleration doesn’t work, and there’s no Thunderbolt / docking station support.

    MacBooks are great hardware, but I don’t think they’re the best option for Linux right now. If you’re never going to boot into macOS, I’d look at an X13 or the new Qualcomm machines. Isn’t there a Framework arm64 option now, or was that a RISC-V module?

    I’m also assuming you’re not looking to do any gaming? Because gaming on ARM is not really a thing right now and doesn’t feel like it will be for a long while.

  • I’m really curious how the Visor headset gets reviewed and performs. Their subscription pricing model is interesting.

    VR has had some interesting success in the last few years but it feels like a tough job to strike the right balance on cost and performance.

  • Really love Arch and the AUR. I've been tempted to get Nix set up for the rare cases when there's no AUR package or the AUR package is unmaintained. I figure if there's no package in the AUR or nixpkgs, it's probably not worth running.

  • I’m a Unity noob and even more of a noob in Godot, but the C# development experience is so much better in Godot it’s ridiculous.

    I remember when Unity announced moving towards .NET Core, what was it, like six years ago? I can appreciate that’s a large effort, but they’ve made ridiculously little progress that I can see.

  • btop reports some GPU, network, and disk information that I don't think shows up in htop, so it feels a bit more comprehensive. Both are fine, but I too use btop; it's nice.

    Random trivia: I think btop has been rewritten like 3-5 times now? It's sort of an inside joke, to the point that someone suggested another rewrite from C++ to Rust (https://github.com/aristocratos/btop/issues/5). I guess the guy just likes writing system-monitoring console apps.

  • It's not uncommon on sensitive stories like this for the government to loop in journalists ahead of time so they can pull together background and research, with an agreed-upon embargo until some point in the future.

    This wasn't the US government telling the newspaper they couldn't report on a story they had uncovered from their own investigation.

  • The Rule

  • There's quantization, which basically compresses the model by using a smaller data type for each weight. It reduces memory requirements by half or even more.
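
    Roughly the idea, as a toy sketch (symmetric int8 quantization of a single weight matrix in plain numpy; real tools like bitsandbytes or llama.cpp's GGUF formats use fancier per-block schemes, and the matrix shape here is made up):

    ```python
    import numpy as np

    # One fp32 weight matrix: 4096 x 4096 x 4 bytes = 64 MiB.
    weights = np.random.randn(4096, 4096).astype(np.float32)

    # Map the observed value range onto int8 [-127, 127] with one scale factor.
    scale = np.abs(weights).max() / 127.0
    quantized = np.round(weights / scale).astype(np.int8)  # 16 MiB, 4x smaller

    # At compute time the weights get dequantized (exactly or approximately).
    restored = quantized.astype(np.float32) * scale
    print("max abs error:", np.abs(weights - restored).max())
    ```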

    There's also airllm, which loads part of the model into RAM, runs those calculations, unloads that part, loads the next part, and so on. It's a nice option, but the performance of all that loading/unloading is never going to be great, especially on a huge model like Llama 405B.
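
    In spirit it's something like this (load_layer is a hypothetical stand-in, not airllm's actual API; the real library streams transformer layers from disk):

    ```python
    import numpy as np

    def load_layer(path: str) -> np.ndarray:
        # Hypothetical helper: pretend this reads one layer's weights
        # from disk (safetensors, mmap, whatever).
        return np.load(path)

    def forward(layer_paths: list[str], x: np.ndarray) -> np.ndarray:
        # Peak memory is one layer, not the whole model; the price is
        # re-reading every layer from disk on every forward pass.
        for path in layer_paths:
            layer = load_layer(path)   # load this chunk into RAM
            x = x @ layer              # stand-in for the real layer math
            del layer                  # unload before loading the next
        return x
    ```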

    Then there are some neat projects to distribute models across multiple computers like exo and petals. They're more targeted at a p2p-style random collection of computers. I've run petals in a small cluster and it works reasonably well.