Copied from an earlier comment of mine.
Prompt: Is Lumo's training data its own, or is it based on other pre-established data?
Response: "Lumo is powered by several open-source large language models (LLMs) that have been optimized by Proton. The models used include Nemo (Mistral), OpenHands 32B, OLMO 2 32B (Allen Institute for AI), and Mistral Small 3 (Mistral). These models run exclusively on servers controlled by Proton, ensuring that all conversations remain private. Proton does not contribute any user data to the training of these models, maintaining a strong focus on user privacy."
This is what I got from Lumo.
Been using Piefed on the Interstellar app, and it's pretty nice. As others have said, collecting seemingly duplicate posts into one is great. Overall it doesn't feel much different for an average use case.
Maybe I read that wrong, but Cranky was the OG DK, and Cranky is the one who kidnapped Pauline in the 1981 arcade game. Our boy DK in the picture is his grandson.
https://mario.fandom.com/wiki/Cranky_Kong
GoT comes to mind with this. The whole show is one giant buildup: the war in the north, the dragons, the Westeros power grabs, Jaime's character arc, the Children of the Forest, and other cool concepts that ended in a pathetic wet fart of a final season. It makes watching the show feel like a waste when you know that at the end it's just a wet fart.
Fair enough.