I watched all of that and I still don't get it
Does Google Cloud not count as “own hardware” for google?
That's why the bars are so different. The "cloud" price is MSRP
That's not really machine learning though. If you wanted to go way back, AI research goes back to implementations of Hebbian learning in computer science in the 1950s as a way of emulating human neurons. I was merely pointing out that AI was a computer science "dead end" until restricted Boltzmann machines were revisited by Hinton et al. around 2006, and that 99% of the growth in the field has happened since the early 2010s, when we reached a turning point where deep learning models could actually outperform classical statistical models like regression and random forests
in the coming decades
Given that in the past 15 years we went from "solving regression problems a little bit better than linear models some of the time" to what we have now, it's not unfounded to think 15 years from now people could be giving LLMs access to code execution environments
Pagers operate at a lower broadcast frequency than cell phones. Longer wavelengths (low frequency) are less impeded by walls and interference.
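Back-of-the-envelope: wavelength is just the speed of light divided by frequency, so a VHF paging signal (~158 MHz, an assumed typical paging frequency) is roughly an order of magnitude longer than a ~1900 MHz cell signal:

```python
# Wavelength comparison: lambda = c / f.
# 158 MHz (pager) and 1900 MHz (cell) are assumed representative frequencies.
C = 299_792_458  # speed of light in m/s

def wavelength_m(freq_hz: float) -> float:
    """Wavelength in meters for a given frequency in Hz."""
    return C / freq_hz

pager = wavelength_m(158e6)    # roughly 1.9 m
cell = wavelength_m(1900e6)    # roughly 0.16 m
print(f"pager: {pager:.2f} m, cell: {cell:.3f} m")
```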
This is only true for Steam keys sold on other platforms, afaik
Nothing more beautiful than seeing transparent yellow-orange overlaid on top of transparent orange-yellow
What (widely popular) race could possibly be a better metric of endurance than the marathon?
It's not uncommon to see certain sites only work on Chromium because the dev used filesystem APIs that don't exist in FF
Yes, but if its first instinct is "go left" on 1-2, it's pretty apparent the reward function could use some tuning
It's not necessary, but there's no reason not to.
Pros:
- production and development programs are more similar
- upgrading your base image won't affect your python packages
- you can use multi stage builds to create drastically smaller final images
Cons:
- you have to type `venv/bin/python3` instead of just `python3` in the RUN line of your Dockerfile
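For what it's worth, a multi-stage build with a venv can be sketched like this (base image tag, file names, and paths are illustrative, not prescriptive):

```dockerfile
# build stage: create the venv and install dependencies into it
FROM python:3.12-slim AS build
RUN python3 -m venv /venv
COPY requirements.txt .
RUN /venv/bin/pip install -r requirements.txt

# final stage: copy only the finished venv, leaving pip caches and
# build tooling behind, which is where the size savings come from
FROM python:3.12-slim
COPY --from=build /venv /venv
COPY app.py .
CMD ["/venv/bin/python3", "app.py"]
```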
When you teach a child what a dinosaur is, you have to do a lot more explaining than when you try to teach an adult what a dinosaur is in French - the child isn't just learning a language for those 10 years.
COVID has long lasting effects across your entire body, it impacts your immune and cardiovascular systems for years after you get it.
Risks of heart attack, stroke, etc. are increased 2.5x in the year after infection, increasing with the number of times you have been infected. In fact, the risk of pretty much ANY health problem skyrockets following a COVID infection.
So I would say that if you’re explicitly trying to use Python, a Pi is the way to go.
I will point out you can run MicroPython on a lot of embedded boards now. I haven't used it, so I don't know if it's actually good or if it's more like those software-gore "here is my Python package for building web front ends that is somehow worse than JS" packages you always see on Python boards
Yes, but if someone trips over the cord there's a 50% chance the wrong side comes unplugged and potentially kills them, which is why they don't make these cords
You gotta donate to planned parenthood for every dollar spent there. It's like buying carbon offsets, but for sandwiches. /s
This seems more like a collection of examples than an actual attempt at a definition.
At its core, AI is a program that takes a given input and returns the output that, during its training phase, would be expected to minimize its error (or maximize its reward).
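A minimal sketch of that definition, using nothing beyond plain Python: gradient descent nudges a single parameter so the model's outputs minimize squared error on toy training data (the data and learning rate here are made up for illustration):

```python
# Fit y = w * x to toy data generated with true w = 3,
# using gradient descent on mean squared error.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 6.0, 9.0, 12.0]  # true relationship: y = 3x

w = 0.0    # model parameter, starts uninformed
lr = 0.01  # learning rate

for _ in range(1000):
    # gradient of mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad  # step in the direction that reduces error

print(round(w, 3))  # converges near 3.0
```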
More, but not way more: they would be licensing Windows IoT, not a full-blown OS, and they wouldn't be paying over-the-counter retail rates for it.