I'm writing this from a crappy laptop with 2GB of RAM and a dull screen.
Resources are just way cheaper than developers.
It's a lot cheaper to have double the RAM than it is to pay for someone to optimize your code.
And if you're working with code that requires that level of resource optimization, you'll invariably end up with low-level code libraries that are hard to maintain.
... But fuck the Always on internet connection and DRM for sure.
If you consider only the RAM on the developers' PCs, maybe. If you count the thousands of customer PCs, then optimizing the code outperforms hardware upgrades pretty fast. If millions have to buy new hardware because of a new Windows feature, that's pretty disastrous from a sustainability point of view.
But that's just more business!
Last time I checked - your personal computer wasn't a company cost.
Until it is, nothing changes - and to be totally frank, the last thing I want is to be on a corporate machine at home.
As a developer, my default definition of “slow” is whether it’s slow on my machine. Not ideal, but chimp brain do chimp brain things. My eyes see my own screen all day, not yours.
You can also build a chair out of shitty plywood that falls apart when someone who weighs a bit more sits on it, instead of quality cut wood. I mean, fine if you want to make a bad product but then you’re making a bad product.
Resource optimization has nothing to do with product quality. Really good experiences can be done with shitty resource consumption. Really bad experiences can be blisteringly fast in optimization.
The reason programmers work in increasingly abstract languages is to do more with less effort at the cost of less efficient resource utilization.
Rollercoaster Tycoon was ASM. Slay the Spire was Java. They're both excellent games.
It’s a lot cheaper to have double the ram
Yeah, a lot cheaper to force someone else to buy double the RAM. No thanks.
Companies don't pay for your 2x RAM and it doesn't slow down their user acquisition so they don't care.
Reminds me of a funny story I heard Tom Petty once tell. Apparently, he had a buddy with a POS car with a crappy stereo, and Tom insisted that all his records be mixed and mastered not so that they sounded great on the studio's million-dollar equipment, but so that they sounded great in his friend's car.
That's how my professors instructed me to mix. To make it sound as good on shitty speakers as possible and also sound good on expensive systems.
Reminds me of the ass audio mixing in movies where it is only enjoyable in a 7.1 cinema or your rich friend's home theater but not on your own setup
It seems we've lost sight of reality there.
Since we don't go to the cinema much any more, I hope they bring back essentially a Dolby Noise Switch for movies. I don't want to sacrifice too much, but booming noise followed by what comes out as whispered dialogue really cheapens the experience.
I hope they can find a process that gives us back a sound track for the sub-17:7 sound system.
I had the same exact approach back in the late 90's. My friends had several band projects and when they were mixing their demos, I insisted that if the mixes sound good in a standard car stereo, they'll sound good anywhere.
This is still a perfectly sound method.
Getting the music you made in your own DAW to sound good on your home speakers is almost easy. Getting it to not suck on shitty speakers? That's an art.
Mr. Petty is a wise man.
Then again, my stock 2016 Yaris had the best sound I ever heard anywhere.
Most of the abstractions, frameworks, "bloats", etc. are there to make development easier and therefore cheaper, but to run such software you need more and more expensive hardware. In a way it is just pushing some of the development costs onto the consumer.
Most of the abstractions, frameworks, “bloats”, etc. are there to make development easier and therefore cheaper
That's true to an extent. But I've been on the back side of this kind of development, and the frameworks can quickly become their own arcane, esoteric beasts. One guy implements the "quick and easy" framework (with 16 GB of bloat) and then fucks off to do other things without letting anyone else know how to best use it. Then the half-dozen coders that come in behind have no idea how to do anything and end up making these bizarre hacks and spaghetti-code patches to do what the framework was already doing, but slower and worse.
The end result is a program that needs top of the line hardware to execute an oversized pile of javascripts.
But this does not necessarily mean the consumer pays more. Buying a current machine and having access to affordable software seems like a good deal.
Capitalism makes it work only in one direction. Something became cheaper? Profits go up. Something became more expensive? Prices go up.
If the software is much more expensive to develop, most of it just won't exist at all. You can get the same effect by just not using software you feel is bloated.
Reminds me of the UK's Government Digital Services, who want to digitise government processes but also have a responsibility to keep that service as accessible and streamlined as possible, so that even a homeless person using a £10 phone on a 2G data service still has an acceptable experience.
An example: here they painstakingly remove jQuery (most modern frameworks are way too big) from the site and shave 32 KB off the site size.
That's the most professional comment section I've ever fucking seen.
Website is amazingly responsive as well, seems to be working.
Hasn't been linked to reddit yet probably.
Getting away from reddit has shown me that there are unspoiled places in the digital world out there, communities of people who actually care about the topic and not performativity and internet attention.
The issue with UK services is that they're all fucking random and plenty of sections don't work. There are billions of logins and bugs, and sometimes you just get redirected to some bloody nightmare portal from the 1990s. And EU citizens couldn't log in to the HMRC portal for years after Brexit, what a fucking joke! And all they do is spend time removing jQuery, good fucking job!
Contentious in the comments!
At a certain point it makes more sense to subsidize better low-end hardware than to make every web site usable on a 20 year old flip phone. I'd argue that if saving 32 kB is considered a big win, you're well past that point. Get that homeless guy a £50 phone and quit wasting the time of a bunch of engineers who make more than that in an hour.
Get that homeless guy a home.
Also, if you are in a basement/mountains/middle of Siberia, waiting for 32 kB takes quite some time.
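Back of the envelope, with an assumed effective 2G throughput of about 50 kbps (a made-up but plausible figure, not from the thread):

```python
# rough transfer time for 32 kB of page weight over a slow link
size_bits = 32 * 1024 * 8   # 32 kB expressed in bits
link_bps = 50_000           # assumed effective 2G throughput (~50 kbps)
seconds = size_bits / link_bps
print(f"{seconds:.1f} s")   # roughly 5 seconds, before latency or packet loss
```

So shaving 32 kB really can mean saving multiple seconds per page load on the worst connections.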
Also, the engineers already had the tech debt of updating to a new jQuery version, which can result in a lot of weird bugs, so this was achieving two goals at once.
And a £50 phone probably IS their target device.
When my dad died suddenly in 2015 and I cleared out his office at his job, I spun down his Win95 machine that he'd been using for essential coding and testing. My father was that programmer—the one who directly spoke to a limited number of clients and stakeholders because he had a tendency to ask people if they were stupid.
Your dad sounds like the childhood hero of mine who got me into computers.
Severe ADHD prevented me from ever learning to code, but I became damn good at repairs and things and just general understanding of computers because he was available to ask questions at almost any time.
He went to school auctions every year and got me a pile of hardware to learn from. He never asked for anything in exchange. All around great guy.
I heard him on the phone a few times dealing with the people who he worked with though. Good god he was mean. I couldn’t imagine him being that way with me ever, but he was brutal when it came to work and money.
A dude called him one time while I was sitting there, he listened for a few minutes and he said, "I've got a 14 year old kid here, he's been doing this stuff for about 2 years. I'm gonna let him walk you through this for the 10th fucking time because you're a goddamn idiot and feeling like a fool when you hang up the phone with a grown man isn't teaching you any lessons. Maybe get a pen for this one because if I have to remind you that a child walked you through it last time, I'm not going to be so fucking friendly." I was so nervous, apologized multiple times, and when I was finished walking him through it he took the phone and said, "now don't you feel stupid? 25 years and this kid just schooled you."
He told me, “you gotta be real with idiots or they’ll bother you with stupid problems every single day of your life.”
I wish that lesson had stuck haha, it just wasn’t in me to be mean. As a result, a hobby that I was passionate about all of my life is something I avoid like the plague now. People ruined it for me by bothering me constantly.
When you see what ONE coder was able to do in the 80s, with 64K of RAM, on a 4MHz CPU, and in assembly, it's quite incredible. I miss my Amstrad CPC6128 and all its good games.
Still happens.
Animal Well was coded by one guy, and it was ~35 MB on release (I think it's above 100 at this point after a few updates, but still). The game is massive and pretty complex. And it's the size of an SNES ROM.
Dwarf Fortress has to be one of the most complex simulations ever created, developed by two brothers and given out for free for decades. Prior to adding actual graphics, DF was ~100 MB, and the Steam version is still remarkably compact.
I am consistently amazed by people's ingenuity with this stuff.
SNES ROMs were actually around 4 MB. People always spoke about them being 32 Meg or whatever, but they meant megabits.
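The mixup is just units: cartridges were marketed in megabits, while file sizes are usually quoted in megabytes, so divide by eight:

```python
# a "32 Meg" SNES cartridge is 32 megabits, not 32 megabytes
advertised_megabits = 32
megabytes = advertised_megabits / 8   # 8 bits per byte
print(megabytes)  # 4.0
```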
I did like Animal Well, but gave up after looking at one of the bunny solutions and deciding I didn't have the patience for that.
I think most of the size of games is just graphics and audio. I think the code for most games is pretty small, but for some godforsaken reason it's really important that they include incredibly detailed doorknobs and 50 hours of high quality speech for a dozen languages in raw format.
When you see what they did in the 60s and 70s, running an entire country's social security system on a mainframe with a whopping 16 KB of memory (I'm not sure if it was 4 or 16, but it doesn't make that much difference), it's even more incredible.
Doesn't really matter what your developers run on, you need your QA to be running on trash hardware.
We can even cut out the middleman and optimize Unity and Unreal to run on crap
Joke's on you, my corporate job has crippled the Mac they gave us so much that EVERYONE has trash hardware!
I can think of a few game franchises that wouldn't have trashed their reputation if they'd had an internal rule like "if it doesn't play on 50% of the machines on Steam's hardware survey, it's not going out"
I think it's given us a big wave of "Return to pixelated tradition" style games. When you see 16-bit sprites in the teaser, you can feel reasonably confident your computer will run it.
I don't mind if indie devs try something experimental that melts your computer. Like BeamNG needs a decent computer, but the target audience kinda knows about that sort of stuff.
The problem is with games like Cities: Skylines 2. Most people buying that game probably don't even know how much RAM they have; it shouldn't be unplayable on a mid-range PC.
I'm making a 3D game now, and the goal is to get it running on a GTX 660
And it plays at 5-15 fps like Rabi-Ribi.
Unless they use Unreal Engine and don't know what they are doing. It can be pixely and run like ass.
Octopath Traveler was the last UE based game that really ran well that I can remember.
I knew someone that refused to upgrade the programmers' workstations precisely because it would have been a big leap in performance compared to what their customers used the software on. Needless to say the program was very fast even on weaker hardware.
That someone is me.
We need more shorter games, made by happier devs paid more to work fewer hours, with worse graphics.
I need a glass of whiskey.
Don't we all, though I'm not opposed to any good liquor
Steam is full of shorter games with worse graphics made by indie devs. Guess what? No one gives a shit! Because no one needs crappy games from the 1980s.
I quite like many games with "poor" graphics. Perhaps not exclusively, but you're seriously missing out if you only go for realistic-looking or detailed games. Give a few of those indie games a try, you might be surprised.
Edit: Oh, and terminal games are cool! Usually not very performant though.
Some of them are really successful. Many people care. Others don't.
Here are the current Steam charts. Many indie games, some few with really low specs. Banana only needs 30 MB of RAM. Seems to be a great game. Honestly now, why are 50k people playing that "game" currently?
But back on topic: Yes, AAA games are more successful and earn much more money, but claiming "no one cares about indie" is stupid when so many people play games like Rust, Stardew Valley, Prison Architect, Terraria, RimWorld, Valheim, The Forest, ...
The indie boomer shooter scene is exploding, actually
The ideal is “plays fine at lowest graphics settings on old hardware” while having “high graphics settings” that look fantastic but require top-of-the-line hardware to play reasonably.
Generally this is almost impossible to achieve.
So you're saying there IS a chance...
But... where is the innovation? (And also, where's the alt text?)
Probably an innovative revelation of the concept of "bloat".
Image description.
The image is a screenshot of a tumblr post by user elbiotipo.
My solution for bloatware is this: by law you should hire in every programming team someone who is Like, A Guy who has a crappy laptop with 4GB and an integrated graphics card, no scratch that, 2 GB of RAM, and a rural internet connection. And every time someone in your team proposes to add shit like NPCs with visible pores or ray tracing or all the bloatware that Windows, Adobe, etc. are doing now, they have to come back and try your project in the Guy’s laptop and answer to him. He is allowed to insult you and humiliate you if it doesn’t work in his laptop, and you should by law apologize and optimize it for him. If you try to put any kind of DRM or permanent internet connection, he is legally allowed to shoot you.
With about 5 or 10 years of that, we will fix the world.
Innovation is orthogonal to code size. There's nothing most modern computers are running that couldn't be done on 10-year-old computers. It's just a question of whether the team creating your software is plugging together gigantic pieces of bloatware or whether they actually develop a solution to a real problem.
I think that every operating system needs to have a "do what the fuck I told you to" mode, especially as it comes to networking. I've come close to going full Luddite just trying to get smart home devices to connect to a non-internet-connected network (which of course you can only do through a dogshit app) and having my phone constantly try to drop that network since it has no internet.
I get the desire to have everything be as hand-holdy as possible, but it's really frustrating when the hand holding way doesn't work and there is absolutely zero recourse, and even less ability to tell what went wrong.
Then there's my day job, where I get to deal with crappy industrial software, flaky internet connections, and really annoying things like Hyper-V occupying network ports when it's not even open.
You could make your own smart devices. You don't even need to be smart in embedded systems these days either. Just use a cheap SBC.
I try not to buy any Wi-Fi smart home devices anymore. I try to stick to Z-Wave or Zigbee; Z-Wave I have better luck with generally. I even left my Nest thermostat at my old house and installed a 10+ year old Z-Wave thermostat at the new one. Way happier. I'm not relying on Google's API being stable anymore for Home Assistant interaction.
Just use Linux?
The thing is that developers tend to keep things as simple as possible and even over-optimize stuff; when you find bloatware, it's usually because some manager decided to have it.
It's the marketing. Always the marketing. Especially the SEO guys.
One SEO guy we worked with told us not to cache our websites because he was convinced that it helped. He badgered us about it for weeks, showed us some bullshit graphs and whatever. One day we got fed up and told him we'd disabled the cache and he should keep an eye out for any improvements in traffic. Obviously we didn't actually do anything of the sort because we are not fucking idiots. Couple days later the SEO wizard sent us another bunch of figures and said "see, I told you it would help I know my stuff". He did not, in fact, know his stuff.
Couple days later the SEO wizard sent us another bunch of figures and said “see, I told you it would help I know my stuff”. He did not, in fact, know his stuff.
Ahaha no way.
The thing is
Of course, we developers like to optimize and patch source code all the time. If I am suddenly woken up at three in the morning, I will immediately open the lid of my laptop and start optimizing the code. That's our little developer secret.
If I can't type the program into my TRS-80 from a computer magazine I don't trust it.
I was going to laugh at the hypocrisy of you using Lemmy, definitely not from your TRS-80, but I don't trust my phone, the Lemmy app, or the Lemmy server either.
Trash-80 dialing in to a Linux shell account using one of the various CLI Lemmy clients should work
We just need to write a version of Lemmy that runs in BASIC on a TRS-80 and publish a magazine
I'm training to work in hardware currently. It's my hope that there, at least, people still care about min-maxing power vs performance.
My understanding is that hardware companies usually alternate generations: one for performance, one for power. It seems like this is the balance that makes the market happy.
This is the way. Most games today run like shit because people don't know or care about computer resource management.
Better to run it for a whole generation, like 30 years, so people can plan their upgrades ahead for when everyone is ready
I think I already posted this at some point, but Software Disenchantment is always worth mentioning in this context.
But how would you implement that new Microsoft screenshot surveillance bullshit feature? Just imagine what a giant waste of resources that is. You have something on your screen which is information, most likely already in a form that's easy to process, like text. But it takes a screenshot every few seconds and uses some "AI" to make the already existing information searchable again from a fucking screenshot??? Maybe I missed something, but that is how I understood the feature.
If it was for surveillance, do you really think they'd tell you about it?
Stop using JS/Node for even brewing your coffee and see this problem resolves itself.
Spring 5 has WebFlux, which runs on top of Netty. This is usually how I heat my home.
antiX made my old Chromebooks usable. Old tech is fun.
antiX is the last one standing on support for old hardware; Gentoo, Debian, and Tumbleweed are good ones too, since they support a WIDE range of architectures
It's not that the hardware isn't capable, it's just that the manufacturer isn't willing. Or rather, willing to not.
How does the saying go? 32 MB of RAM and always swapping?
I've heard it about Emacs: Eight Megabytes And Constantly Swapping (I use Emacs BTW)
You're awesome for correcting me (I use vim BTW)
Plus Emacs and Lisp are superior. But I never managed to jump ship.
I 100% agree. But, where Linux?
Can current Windows even work with 2 GB of RAM?
Yep, minimum for Win 10: 1 GHz CPU, 1 GB RAM (32-bit) or 2 GB RAM (64-bit). Just don't expect much out of it lol
In 1000 years this meme/tweet/post will be what my entire generation's existence is known for. No one will remember the politics, the disasters, the geopolitical events good or bad; they will remember our entire world and existence as the only time technological advancement was driven by the big tech mafia trying to see how far it can get its dick in your digital footprint.
It's the new cops vs. robbers or bootleggers vs. prohibition race. Our tech is getting faster to outrun the corporate fucking malware, but the faster we go the more they stuff in, so the average user ends up paying $6k for a GPU/CPU combo that runs with the same efficiency as my school library's computer did running Oregon Trail on MS-DOS in 1995. You are so confined to functions with massive fucking app buttons that even logging in as a guest user requires you to memorize every CLI ever made.
It's become my defining "I don't want to live in this world anymore"
You're provoking my alcoholism. I reread your comment three times and here I am on my way to the liquor shop.
I make sure my own web game can run smoothly on crappy hardware. It runs well on my gaming laptop downclocked to 400MHz with a 4x slowdown set by Chrome. It also loads in a couple seconds with a typical crappy Internet connection of 200kbps and >10% packet loss. However, it doesn't run smoothly on my Snapdragon 425 phone or my old Core 2 Duo laptop. Is this my game or just browser overhead?
You have my vote.
that's why I've been doing most of my gamedev stuff on an old craptop from 2016.
performance issues become apparent immediately
Can you even run Windows with just 2 GB?
You could probably run Windows 7 without any issues at all.
Yes it can!
Tiny11 is a stripped down custom build of Windows 11, which only requires 8 GB of storage and 2 GB of RAM.
Someone even got it to run on 200 MB of RAM.
I really like the way Ameliorated/AME Wizard handles the debloating. You take a Windows ISO and install like usual, then run AME with a playbook (like AtlasOS), which strips out the bloat through a collection of scripts. AME Wizard is open source, and you can directly inspect all the scripts within the playbook, whereas Tiny11 is a whole ISO that is hard to verify. Not saying that I can personally vouch that it is completely trustworthy, as I have only taken a brief look at the code and scripts, but I like to have the option. It also means that I could modify out any changes I don't like.
I found out about AME Wizard when I had to reformat a MiPad 2 tablet with 2 GB of RAM, and so far it has worked better than when the tablet was new. The only downside is that you go through the full Win 11 install, so you need enough available space and then reclaim the wasted space after, but it is at least mostly automated.
Do you want to run Windows?
No, but I also don't want to only have 2GB XD
It depends on the version. I ran Windows 98 on 64 MB, then I upgraded and ran Windows XP on 1 GB for quite a while before switching to Linux.
Sure (with a bit of effort). Can you run Windows software with just 2GB? Now that's a completely different problem.
No, pure DOS only
The Nexus OS Windows repack uses 400 MB of RAM on startup
I like how KDE has been getting faster and faster as time went on. Like Lisa Simpson's perpetuum mobile.
Hell yeah, I'd be getting paid for shouting at people
can we get this law passed pls
Yeah, you'd better hurry.
Hey I know that guy he's me
No, it wasn't me.
I wrote an email service called Port87, and I did it on a really low end laptop (an Ideapad 3 from 2021) to make sure that it works well, even on a potato.
If an Ideapad 3 is a potato now what's an EeePC?
Yeah, screw CEF, Electron, and webdevs who can't live without those.
I would love to be this person. I'll even fix some of the bugs I find.
This is like the definition of a "conservative". Progress shouldn't happen because they're not ready for it. They are comfortable with what they use and are upset that other people are moving ahead with new things. New things shouldn't be allowed.
Most games have the ability to downscale so that people like this can still play. We don't stop all progress just because some people aren't comfortable with it. You learn to adjust or catch up.
More "conservative" in terms of preserving the planet's resources.
You don't need Gigabytes of RAM for almost any consumer application, as long as the programming team was interested/incentivized to write quality software.
I think the examples given are just poorly chosen. When it comes to regular applications and DRM, then yes, that's ridiculous.
On the other hand, when it comes to gaming, then yes, give me all the raytracing and visible pores on NPCs. Most modern games also scale down well enough that it's not a problem to have those features.
It's conservationist, reducing hardware requirements to lengthen the lifetime of old hardware.
Less so for general software, but on the gaming side, why target the iGPU then? Although it's common, even a dedicated GPU nearly a decade old would be an instant uplift in gaming performance. The ones who typically run into performance problems are mostly laptop users, and laptops are the segment most wasteful with old hardware: unless you own something like a Framework, users constantly replace the entire device.
I for one am always behind lengthening the lifetime of old hardware (hell, I just replaced a decade-old laptop recently), but there's a limit to the expectations to have. E.g., don't expect to be catered to iGPU-wise if you willingly picked a pre-Tiger Lake iGPU. The user intentionally picked the worse graphics hardware, and catering the market to bad decisions is a bad move.
If they can downscale enough, they should be able to pass this test.
One could point to the inclusive or environmental aspect to this approach.
Honestly we are hitting the budgetary limits of what game graphics can do, for example.
A lot of new games look substantially worse than the Last of Us Part 2, which ran on ancient hardware.
"Limitations foster creativity."
100% agree. But there's no reason to limit innovation because some people can't take advantage of it. Just like we shouldn't force people to consistently upgrade just to have access to something; however, there should be a limit to this. 20 years of tech changes is huge. Most home computers could take 2 GB of RAM back in the early-to-mid 2000s... that's two decades ago.
I'm still gaming on my desktop that I built 10 years ago quite comfortably.
Somebody didn't live through the "Morrowind on Xbox" era, where "creativity" meant intentionally freezing the loading screen and rebooting your system in order to save a few KB of RAM so the cell would load.
But also having no automatic corpse cleanup, so the game would eventually become unplayable as entities died outside of your playable area, so you couldn't remove them from the game, creating huge bloat in your save file.
Not all creativity is good creativity.
The topic is bloatware, not games. Very different. When it comes to gaming, the hardware costs are a given (for the sake of innovation, as you put it); but when it comes to something fundamental to your computer—think of the window manager or even the operating system itself—bloat is like poison in the hardware's veins. It is not innovation. It is simply a waste of precious resources.
I played Skyrim on an i3-4005U with integrated graphics and 4 GB of RAM when I was in high school. They did epic work in the 7th console generation, given the limitations were 512 MB of shared memory and a 250 GFLOPS GPU.
You're a heretic.
By the way, I only posted that picture because I liked the background color. I didn't read the text itself.
Kinda has a point...
Yes, this would work with most applications.
lol.
Yeah, well, I have 1.8 GB so 😏
Get yourself a drink now!
You sound like some guy screaming everyone should own a horse after the car became popular.
I think that analogy oversimplifies by assuming that one is inherently better than the other. OP's point here is that it isn't better across the board.
I think a more accurate analogy would be that the OP is screaming that horse trails, ranches, and farms are being shut down because they don't accommodate cars.
Planned obsolescence is one of the major engines that keep our current system of oligarchic hypercapitalism alive. Won't anybody think of the poor oligarchs?!?
https://media.tenor.com/dYLvyNRhV8sAAAAC/woody-harrelson-crying.gif
FOR THE LOVE OF GOD, THINK OF THE SHAREHOLDERS!