RE: Is Ernest still here?
  • Get well soon, and thanks for the update.

  • What is going on with kbin - a week has passed with no sign of any life
  • defaming them without due diligence, think about that before continuing

    The irony here is unbelievable, rofl. You can't make this up. My previous statement was calling you childish and desperate for attention. Thanks for reminding me of that fact, so I can stop wasting my time. It is very clear you're not interested in a genuine and constructive conversation.

  • What is going on with kbin - a week has passed with no sign of any life
  • It's not one week of inactivity, it has been going on for months

    Looks at 2 months straight of kbin devlogs since October, when the man was having pretty significant personal issues

    Not to mention he was recently sick, tending to financial issues and personal matters, and handling formalities relating to the project. This isn't even mentioning that he communicated all of this in the devlog magazine, that he has implemented suggestions multiple times at the community's request to enhance QoL, or that he has given users agency in making mod contributions.

    You might want to take your own advice. This has also allowed me to revise my earlier statement. You people are actually insane.

  • Cross posting norms and etiquette?
  • I just use it to bring awareness to similar magazines/communities across the fediverse

  • What is going on with kbin - a week has passed with no sign of any life
  • Agreed, every post I see from them further paints them as very childish and desperate for attention.

  • To those genuinely interested in moderating
  • Since @ernest is the owner of that magazine, I think moderator requests have to go through him. Unfortunately, he was dealing with a slight fever a while ago, and has also been handling financial planning and project formalities. Hopefully things haven't gotten worse. For what it's worth, I think it's great you're eager to contribute. There have definitely been some spam issues recently, and I hope a solution can be found soon. Maybe posts with a <10% upvote-to-downvote ratio over a day/week could be temporarily quarantined until an admin approves them; a rough sketch of that idea follows below. Anyways, best of luck with modship.
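
    A toy version of that quarantine heuristic, reading the ratio as the share of upvotes among total votes (hypothetical names, not Kbin's actual data model or API):

    ```python
    from dataclasses import dataclass

    QUARANTINE_THRESHOLD = 0.10  # assumed cutoff from the suggestion above

    @dataclass
    class Post:
        upvotes: int
        downvotes: int
        quarantined: bool = False

    def review_post(post: Post) -> Post:
        """Quarantine a post whose upvote share falls below the threshold."""
        total = post.upvotes + post.downvotes
        if total > 0 and post.upvotes / total < QUARANTINE_THRESHOLD:
            post.quarantined = True  # hidden until an admin approves it
        return post

    print(review_post(Post(upvotes=1, downvotes=20)).quarantined)  # True
    ```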

  • The Legend of Zelda Orchestra Concert set for February 9
  • Came here to post because I've also seen The Symphony of the Goddess live. The poster for it is behind me at the moment. Great experience.

  • I feel like I'm missing out by not distro-hopping
  • I've only felt the need to change distros once, from Linux Mint to EndeavourOS, because I wanted Wayland support. I realize there were ways to get Wayland working on Mint in the past, but I've already made the switch and have gotten used to my current setup. I personally don't feel like I'm missing out by sticking to one distro, tbh. If you're enjoying Mint, I'd suggest sticking with it unless another distro fulfills a specific need you can't meet on Mint.

  • What's the quickest way to find a magazine/community you've subscribed to?
  • You could make a (private) collection for your subscribed magazines. It's not exactly the feature you were asking for, but it's an option for curating your feed. On Firefox I have various collections bookmarked and tagged, so accessing them is seamless.

  • To all moderators: Here is how you can add banner using CSS very easily to your Kbin magazines!
  • I imagine something like this

    Duly noted, I missed a line of text. Won't try to help in the future

  • Old School Cool @lemmy.world daredevil @kbin.social
    Johnny Depp & Matthew Perry (1988)
    3
    Terminal Trove - The $HOME of all things in the terminal.
    terminaltrove.com

    Terminal Trove showcases the best of the terminal. Discover a collection of CLI, TUI, and more developer tools at Terminal Trove.

    1
    イニシエノウタ/デボル (Song of the Ancients / Devola) [Keiichi Okabe]

    Song of the Ancients / Devola · SQUARE ENIX MUSIC · Keiichi Okabe · MONACA

    NieR Gestalt & NieR Replicant Original Soundtrack

    Released on: 2010-04-21

    0
    What shows have extended therapy arcs?
  • Came here with this show in mind. Would recommend.

  • Everybody’s talking about Mistral, an upstart French challenger to OpenAI
  • I haven't, but I'll keep this in mind for the future -- thanks.

  • Everybody’s talking about Mistral, an upstart French challenger to OpenAI
  • I believe I was when I tried it before, but it's possible I may have misconfigured things

  • Everybody’s talking about Mistral, an upstart French challenger to OpenAI
  • I'll give it a shot later today, thanks

    edit: Tried out mistral-7b-instruct-v0.1.Q4_K_M.gguf via the LM Studio app. It runs smoother than I expected -- I get about 7-8 tokens/sec. I'll definitely be playing around with this some more later.
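
    For anyone who'd rather script it than use the GUI, here's a minimal sketch using the llama-cpp-python bindings; the file path and generation settings are assumptions, not LM Studio's internals:

    ```python
    # pip install llama-cpp-python
    from llama_cpp import Llama

    # Assumes the GGUF file mentioned above sits in the working directory.
    llm = Llama(
        model_path="./mistral-7b-instruct-v0.1.Q4_K_M.gguf",
        n_ctx=4096,       # context window to allocate
        n_gpu_layers=-1,  # offload all layers to the GPU; use 0 for CPU-only
    )

    # Mistral's instruct models expect [INST] ... [/INST] prompt tags.
    out = llm("[INST] Summarize what a mixture-of-experts model is. [/INST]",
              max_tokens=256)
    print(out["choices"][0]["text"])
    ```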

  • Everybody’s talking about Mistral, an upstart French challenger to OpenAI
  • That's good to know. I do have 8GB VRAM, so maybe I'll look into it eventually.

  • Everybody’s talking about Mistral, an upstart French challenger to OpenAI
  • I'm looking forward to the day where these tools will be more accessible, too. I've tried playing with some of these models in the past, but my setup can't handle them yet.

  • Everybody’s talking about Mistral, an upstart French challenger to OpenAI
    arstechnica.com

    "Mixture of experts" Mixtral 8x7B helps open-weights AI punch above its weight class.

    On Monday, Mistral AI announced a new AI language model called Mixtral 8x7B, a "mixture of experts" (MoE) model with open weights that reportedly truly matches OpenAI's GPT-3.5 in performance—an achievement that has been claimed by others in the past but is being taken seriously by AI heavyweights such as OpenAI's Andrej Karpathy and Jim Fan. That means we're closer to having a ChatGPT-3.5-level AI assistant that can run freely and locally on our devices, given the right implementation.

    Mistral, based in Paris and founded by Arthur Mensch, Guillaume Lample, and Timothée Lacroix, has seen a rapid rise in the AI space recently. It has been quickly raising venture capital to become a sort of French anti-OpenAI, championing smaller models with eye-catching performance. Most notably, Mistral's models run locally with open weights that can be downloaded and used with fewer restrictions than closed AI models from OpenAI, Anthropic, or Google. (In this context "weights" are the computer files that represent a trained neural network.)

    Mixtral 8x7B can process a 32K token context window and works in French, German, Spanish, Italian, and English. It works much like ChatGPT in that it can assist with compositional tasks, analyze data, troubleshoot software, and write programs. Mistral claims that it outperforms Meta's much larger LLaMA 2 70B (70 billion parameter) large language model and that it matches or exceeds OpenAI's GPT-3.5 on certain benchmarks, as seen in the chart below.

    [Chart: Mixtral 8x7B performance vs. LLaMA 2 70B and GPT-3.5, provided by Mistral.]

    The speed at which open-weights AI models have caught up with OpenAI's top offering a year ago has taken many by surprise. Pietro Schirano, the founder of EverArt, wrote on X, "Just incredible. I am running Mistral 8x7B instruct at 27 tokens per second, completely locally thanks to @LMStudioAI. A model that scores better than GPT-3.5, locally. Imagine where we will be 1 year from now."

    LexicaArt founder Sharif Shameem tweeted, "The Mixtral MoE model genuinely feels like an inflection point — a true GPT-3.5 level model that can run at 30 tokens/sec on an M1. Imagine all the products now possible when inference is 100% free and your data stays on your device." To which Andrej Karpathy replied, "Agree. It feels like the capability / reasoning power has made major strides, lagging behind is more the UI/UX of the whole thing, maybe some tool use finetuning, maybe some RAG databases, etc."

    Mixture of experts

    So what does mixture of experts mean? As this excellent Hugging Face guide explains, it refers to a machine-learning model architecture where a gate network routes input data to different specialized neural network components, known as "experts," for processing. The advantage of this is that it enables more efficient and scalable model training and inference, as only a subset of experts are activated for each input, reducing the computational load compared to monolithic models with equivalent parameter counts.

    In layperson's terms, a MoE is like having a team of specialized workers (the "experts") in a factory, where a smart system (the "gate network") decides which worker is best suited to handle each specific task. This setup makes the whole process more efficient and faster, as each task is done by an expert in that area, and not every worker needs to be involved in every task, unlike in a traditional factory where every worker might have to do a bit of everything.
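
    To make the routing idea concrete, here is a toy top-2 gating layer in plain NumPy. It's a sketch of the general MoE pattern, not from the article and not Mixtral's actual implementation, and all sizes are made up:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    d, n_experts, top_k = 16, 8, 2  # toy sizes; Mixtral routes each token to 2 of 8 experts

    # Each "expert" is a small two-layer feed-forward network.
    experts = [(rng.normal(size=(d, 4 * d)), rng.normal(size=(4 * d, d)))
               for _ in range(n_experts)]
    gate_w = rng.normal(size=(d, n_experts))  # the "gate network": one linear layer

    def moe_forward(x):
        """Route one token vector x to its top-k experts and mix their outputs."""
        logits = x @ gate_w
        top = np.argsort(logits)[-top_k:]  # indices of the chosen experts
        weights = np.exp(logits[top])
        weights /= weights.sum()           # softmax over the chosen experts only
        y = np.zeros(d)
        for w, i in zip(weights, top):
            w_in, w_out = experts[i]
            y += w * (np.maximum(x @ w_in, 0.0) @ w_out)  # ReLU feed-forward expert
        return y

    print(moe_forward(rng.normal(size=d)).shape)  # (16,)
    ```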

    OpenAI has been rumored to use a MoE system with GPT-4, accounting for some of its performance. In the case of Mixtral 8x7B, the name implies that the model is a mixture of eight 7 billion-parameter neural networks, but as Karpathy pointed out in a tweet, the name is slightly misleading because, "it is not all 7B params that are being 8x'd, only the FeedForward blocks in the Transformer are 8x'd, everything else stays the same. Hence also why total number of params is not 56B but only 46.7B."
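
    Karpathy's 46.7B figure can be sanity-checked with quick arithmetic from Mixtral's published configuration (hidden size 4096, FFN size 14336, 32 layers, grouped-query attention with 8 KV heads, 32K vocabulary); treat the breakdown below as a loose approximation:

    ```python
    d, ffn, layers, n_experts, vocab = 4096, 14336, 32, 8, 32000
    kv_dim = 1024  # 8 KV heads x 128 head dim (grouped-query attention)

    attn = layers * (2 * d * d + 2 * d * kv_dim)     # Q and O full-size; K and V grouped
    ffn_all = layers * n_experts * (3 * d * ffn)     # gate/up/down projections per expert
    embed = 2 * vocab * d                            # input embeddings + output head

    print(f"{(attn + ffn_all + embed) / 1e9:.1f}B")  # ~46.7B: only the FFN blocks are 8x'd
    ```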

    Mixtral is not the first "open" mixture of experts model, but it is notable for its relatively small size in parameter count and performance. It's out now, available on Hugging Face and BitTorrent under the Apache 2.0 license. People have been running it locally using an app called LM Studio. Also, Mistral began offering beta access to an API for three levels of Mistral models on Monday.

    26
    Best/usable free Evernote alternative
  • Yeah I wanted to use it for work until I read that. Instead I'm just using Vimwiki since I really only need markdown and linking.

  • Gaming @kbin.social daredevil @kbin.social
    Resident Evil 4 Remake wins PlayStation Game of the Year at the Golden Joystick Awards 2023

    Resident Evil 4 Remake has been crowned PlayStation Game of the Year at The Golden Joysticks 2023 powered by Intel.

    Capcom's third Resident Evil remake was released in March of this year and took players back to rural Spain to confront the mysterious, and deadly, Los Iluminados cult - 18 years after we originally did on the PlayStation 2. Fans clearly loved revisiting the classic survival horror game as it managed to beat out other games in the category including Final Fantasy 16, Street Fighter 6, and Star Wars Jedi: Survivor.

    The other Golden Joystick Awards 2023 nominees in this category can be found below:

    • Final Fantasy 16
    • Resident Evil 4 Remake (winner)
    • Street Fighter 6
    • Humanity
    • Armored Core 6: Fires of Rubicon
    • Star Wars Jedi: Survivor
    4
    Gaming @kbin.social daredevil @kbin.social
    Final Fantasy 7 Rebirth wins Most Wanted Game at the Golden Joystick Awards 2023

    Final Fantasy 7 Rebirth has won the Most Wanted Game category at the Golden Joystick Awards 2023 powered by Intel.

    Due in February of next year, Square Enix's much-anticipated follow-up marks the second part of a planned three-part modern-day reimagining of its 1997 source material.

    Hot on the heels of 2020's Final Fantasy 7 Remake, Final Fantasy 7 Rebirth extends the legendary story beyond Midgar – with a recent trailer teasing familiar spots such as Cid's Rocket Town, Red XIII's Cosmo Canyon, and the indelible Gold Saucer theme park.

    Add flashes of an introspective Sephiroth, Jenova, Junon Harbor and that thoroughfare-dominating parade, and it's easy to see why people are looking forward to this one, and, indeed, why it's come out on top of this year's Golden Joysticks' Most Wanted category.

    Throw in the teasiest of Emerald Weapon teasers, and… yeah, February 29, 2024 really can't come soon enough. Full credit to Final Fantasy 7 Rebirth rising to the top of its 20-game-strong category.

    Here's the full list of Most Wanted Game Golden Joystick 2023 nominees, and as you can see Final Fantasy 7 Rebirth beat 19 other games to come out on top:

    • Death Stranding 2
    • Star Wars Outlaws
    • Final Fantasy VII Rebirth (Winner)
    • Tekken 8
    • Vampire: The Masquerade - Bloodlines 2
    • S.T.A.L.K.E.R. 2: Heart of Chornobyl
    • Hades 2
    • Fable
    • Hollow Knight: Silksong
    • EVERYWHERE
    • Frostpunk 2
    • Ark 2
    • Metal Gear Solid Δ: Snake Eater
    • Persona 3 Reload
    • Bulwark: Falconeer Chronicles
    • Suicide Squad: Kill the Justice League
    • Pacific Drive
    • Black Myth: Wukong
    • Banishers: Ghosts of New Eden
    • Warhammer Age of Sigmar: Realms of Ruin

    Discover the best games of 2023 at the best prices by checking out the Golden Joystick Awards Steam sale page

    0
    What are your gaming pet peeves?

    First one that comes to my mind is having to travel with an NPC and our walk/run speeds don't match.

    24
    /kbin meta @kbin.social daredevil @kbin.social
    To those genuinely interested in moderating

    @Ernest has pushed an update which allows users to request ownership/moderation of abandoned magazines. Ghost/abandoned magazines were fairly prevalent after the initial wave of hype due to users either squatting magazine names or becoming inactive for other reasons. Now is your chance to get involved, if you were waiting to do so.

    To request ownership/moderator privileges, scroll down to where it says "MODERATORS" in the sidebar, then click the icon of a hand pointing upwards and make the request. Cheers, and thank you for your hard work, Ernest, as well as future mods.

    63
    Final Fantasy VII - Let the Battles Begin! - Remake Evolution [Nobuo Uematsu]

    Title: Let the Battles Begin! Name: Final Fantasy VII Year Released: 1997 Composer: Nobuo Uematsu Developer: Square Enix Platform: PlayStation

    0
    Sonic The Hedgehog OST - Green Hill Zone [Masato Nakamura]

    Title: Green Hill Zone Game Name: Sonic the Hedgehog Year Released: 1991 Composer: Masato Nakamura Developer: Sonic Team Platform: Sega Genesis

    0
    Pokemon Blue/Red - Pallet Town [Junichi Masuda]

    Composer: Junichi Masuda Game: Pokémon Red and Blue (Pokémon Red and Green in Japan) Year Released: 1996 Platform: Game Boy

    0
    Version 4.1 Event Wishes Notice - Phase I
    imgur.com
    0
    "Your credible source of the unbelievable truth!"

    "...Euphrasie, three days ago, one of your journalists secretly followed a suspect all the way from the Court of Fontaine to Romaritime Harbor, and almost ended up being tied up and thrown into the sea by a gang of criminals. Whether or not there's any truth in the notion that 'nearer to the action is closer to the truth,' surely Miss Charlotte doesn't value her reports more than she does her own life?" — Yet another exasperated exchange between Captain Chevreuse of the Special Security and Surveillance Patrol and Euphrasie, Editor-in-Chief of The Steambird

    ◆ Name: Charlotte ◆ Title: Lens of Verity ◆ Reporter of The Steambird ◆ Vision: Cryo ◆ Constellation: Hualina Veritas

    Fontaine's famous newspaper The Steambird has a veritable legion of reporters it can call upon, each with their own area of expertise. Some specialize in celebrity gossip, others follow the word on the street, while others still focus on political affairs...

    But among them all, there is one that stands head and shoulders above the rest thanks to her seemingly boundless reserve of energy and perseverance — the inimitable Charlotte.

    Unswervingly committed to the principle that "nearer to the action is closer to the truth," Charlotte has a habit of popping up literally anywhere and everywhere in Fontaine — from its widest avenues to its narrowest back alleys, its highest vantage points to its lowest subterranean vaults, even its tallest mountains to its deepest undersea caverns. She captures the "truth" with her Kamera, records it in her articles, and finally unveils it for all to see.

    And when the "truth" comes out, she's met with a variety of different reactions ranging from applause, to embarrassment, to outright fury. There are even some who would resort to any means necessary to make a particular article connected to themselves disappear. Or alternatively, just make Charlotte disappear.

    For this reason, the newspaper's Editor-in-Chief Euphrasie has on numerous occasions felt the need to distance Charlotte from the Court of Fontaine by sending her off on faraway "field reporting" jobs, only recalling her once the Maison Gardiennage or Special Security and Surveillance Patrol had finally managed to clear things up.

    But despite all this, neither the toil of the job itself nor the pressure of external denunciations and threats has ever fazed Charlotte in the slightest.

    With her trusty companion Monsieur Verite by her side, she invariably carries out her journalistic duties with unfaltering fervor, rushing about in pursuit of all the "truths" out there just waiting to be discovered.

    0
    Endless Solo of Solitude -- Regina of All Waters, Kindreds, Peoples and Laws

    One lie always follows another, and so "justice" awaits inescapably at the end. The ignorant see this as some kind of farce. But if they trace back to the source, they inevitably realize that they began by deceiving themselves. — A disordered fable left in someone's dream by Mage "N"

    ◆ Name: Furina ◆ Title: Endless Solo of Solitude ◆ Regina of All Waters, Kindreds, Peoples and Laws ◆ Gnosis: Hydro ◆ Constellation: Animula Choragi

    Undoubtedly, Furina has been much loved by the people of Fontaine from the moment she became the Hydro Archon. Her charismatic parlance, lively wit, and elegant bearing — all bear witness to her godly charms.

    But perhaps the thing that she is most revered for is her unrivaled sense of drama. As the protagonist of a famous play at the Opera Epiclese once put it, "Life is like the theater — you never can tell when the twist will come."

    Furina is as inscrutable as the most cunning of stage characters, her course of action defying all prediction. In fact it's precisely for this reason that the god of Justice and Judgment, unapproachable in her divine majesty, has such a bewitching influence.

    But when the curtain falls, a hollow feeling invariably starts to creep in. There are those who wonder whether there are moments in the dead of night when even a god like Furina feels the sharp pangs of loneliness.

    No, surely not. People couldn't possibly imagine, let alone believe, that such a scene might play out.

    And that's indeed the way it should be.

    That is, were it not for the fact that Furina's tears had already been silently washed away by the Fountain of Lucine.

    1
    What's something that you're grateful for?

    It could be something from today, the past week, or whatever. All things big or small are welcome too. I'm sitting outside today as a part of my daily routine--it's nice and sunny out, and there's a gentle breeze which feels very relaxing. It's a nice break from the time I spend at the computer.

    7
    Now that some time has passed, what do you like and dislike about kbin?

    I like that kbin is smaller compared to some lemmy instances. I also prefer the UI. Bigger communities tend to feel a bit overwhelming for me. I also appreciate how transparent Ernest has been regarding kbin's development. That said, it's been a bit challenging to figure out how to utilize some of the federation features that kbin has to offer--microblogging in particular. From what I've seen, people don't generally seem too interested in this feature, but I think it's nice to have.

    23
    Looking to switch distros

    Hi, sorry if this isn't the right place for this question. I've been using Linux Mint Cinnamon for about 9 months now and have also been experimenting with an Ubuntu GNOME Wayland session for the past month or so. I don't really like distro-hopping, but using X11 isn't cutting it for me. After giving GNOME an honest shot, I don't think it's for me. However, Wayland has been stellar. I would prefer to keep using LM Cinnamon, but I have a dual-monitor setup with different refresh rates, which has been causing issues.

    I'm interested in Arch, but I'm slightly concerned about the frequent comments regarding things breaking during updates. Also, is maintaining an Arch install heavy on time consumption? I'm not opposed to reading the wiki and spending time here and there to keep things working. However, I'm a bit hesitant if I were to run into an issue that may be more complicated than I may be prepared for. That said, generally I do like the higher skill ceiling options, if that makes sense in this context.

    Tumbleweed seems more beginner-friendly from what I've read so far. While I do generally enjoy challenges, having a smoother day-to-day experience certainly has its own appeal.

    I would primarily be doing some gaming (a mix of more recent AAA titles along with less demanding ones) and programming, along with the usual stuff you'd expect on a desktop setup. I have a Ryzen 5 3600 processor, an AMD 6650 XT GPU, and 16 GB of RAM, if that information helps. Thanks in advance; if this isn't the right place, I'll delete the post.

    Update: I have installed EndeavourOS and things have been smooth so far. The installer was very straightforward, and setup was extremely quick. I have started reinstalling various programs which were part of my original workflow with very minimal issues. The issues primarily came from adjusting to pacman syntax. I also have a series of notes regarding what I have installed and how. Cheers, and thanks for your input, everyone. I will be sticking with Gnome for the time being.

    19
    daredevil daredevil @kbin.social

    I'm just an internet explorer.

    Japanese OK • Chinese OK • Vietnamese OK

    @linguistics @cats @dogs @learnjapanese @japanese @residentevil @genshin_impact @genshinimpact @classicalmusic @persona @finalfantasy

    #linguistics #nlp #compling #linux #foss

    Posts 20
    Comments 180