This should be illegal. Companies should be forced to open-source games (or at least provide the code to people who bought them) if they decide to discontinue them, so people can preserve them on their own.
That's the horrible thing about online services: you never really own them, and they can be taken away from you at any time. If you want to preserve something, you need physical and/or offline access.
And in addition to that sentiment, compression from moving or sending a copy of a copy is known to very slowly degrade digital media, so physical is almost always preferred.
Sure, it's possible, but it's unlikely. A properly kept LaserDisc compared to, for example, a YouTube video isn't even a contest. Physical media not exposed to radiation or impact can last decades if not centuries. Don't even get me started on vinyl.
Literally every seeder is part of that archive. You can look at individual trackers in the microcosm as individual archives and indices, but it's the culture of piracy that causes the wide scale collection and preservation of media.
We're actually at this kind of interesting cross-generational point of guerrilla archival where it's become easier to find certain obscure pieces of media history. I suspect this is in large part due to things like bounties, where suddenly a forgotten VHS of a 35-year-old HBO special that aired once or twice could be a step toward a higher rank and greater access to a wider range of media.
Modern piracy has a strong incentive toward finding lost material that's no longer readily available. Zero-day content is great, but have you seen the RADAR pilot or both seasons of AfterMASH?
They belong in a museum. Indy would be proud, even if Harrison wouldn't. Not that I know his perspective on piracy.
Compression and transmission of data cause loss of parity. We lose or flip some 1s and 0s, and over time the effects become very noticeable. The best visual example I can think of is the experiments where YouTubers downloaded and re-uploaded their own video 100 times; the quality degrades very quickly. In a more reasonable scenario, near-lossless file types and compression would degrade much more slowly.
experiments where YouTubers downloaded and re-uploaded their own video 100 times; the quality degrades very quickly
That just means YouTube's software uses lossy compression; that is a YouTube problem, not a digital media problem. Are you familiar with the concept of file hashing? A short string can be derived from a file such that if any bit of the file is altered, it will produce a different hash. This can be used in combination with other methods to ensure perfect data consistency. For example, a torrent that remains well seeded won't degrade, because the software checks the hash: if anyone's copy changes at all, due to physical degradation of a hard drive or whatever other reason, the error will be recognized and routed around. If you don't want to rely on other people to preserve something, there is always RAID, a decades-old technology that also keeps data from changing or being lost, assuming you maintain your hardware and replace disks as they break.
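A minimal sketch of that hash check in Python (the file name is just a placeholder):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Record the digest once, when the file is known to be good.
original = sha256_of("video.mkv")

# Any time later: if even one bit has flipped, the digest won't match,
# and you restore that copy from a peer or a backup instead of trusting it.
assert sha256_of("video.mkv") == original, "copy is corrupted, re-fetch it"
```

This is essentially what a torrent client does per piece: a piece that fails its hash check is discarded and re-downloaded from another seeder.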
Here's the fundamental reason you're wrong about this: computers are capable of accounting for every bit, conclusively determining if even one of them has changed, and restoring from a redundant backup. If someone wants to perfectly preserve a digital file and has the necessary resources and knowledge, they can easily do so. No offense, but what you are saying is ignorant of a basic property of how computers work and what they are capable of.
It's the most obvious example of a digital media problem. Computers might be able to account for every bit with the use of parity files and backups with frequent parity checks, but the fact is most people aren't running a server with 4 separately powered and monitored drives as their home computer, and even the most complex system of data storage can fail or degrade eventually.
We live in a world of problems, like the YouTube problem, compression problems, encoding problems, etc. We do because we chose efficiency and ease of use over permanency.
Computers might be able to account for every bit with the use of parity files and backups with frequent parity checks
Yes, and this can be done through mostly automatic or distributed processes.
even the most complex system of data storage can fail or degrade eventually.
I wouldn't describe it as complex, just the bare minimum of what is required to actually preserve data with no loss. All physical mediums may degrade through physical processes, but redundant systems can do better.
but the fact is most people aren’t running a server with 4 separately powered and monitored drives as their home computer
It isn't hard to seed a torrent. If a group of people want to preserve a file, they can do it this way, perfectly, forever, so long as there remain people willing to devote space and bandwidth.
We live in a world of problems, like the YouTube problem, compression problems, encoding problems, etc. We do because we chose efficiency and ease of use over permanency.
All of these problems boil down to intent. Do people intend to preserve a file, do they not care, or do they actively favor degradation? In the case of the game in the OP, it seems the latter. Same with YouTube, same with all those media companies removing shows and movies entirely from all public availability, same with a lot of companies. If someone wants to preserve something, they choose the correct algorithms, simple as that. There isn't necessarily much of a tradeoff against efficiency and ease of use in doing so: disk space is cheap, bandwidth is cheap, and the technology is mature and not complicated to use. Long-term physical storage can be a part of that, but it isn't a replacement for intent or process.
I wouldn’t describe it as complex, just the bare minimum of what is required to actually preserve data with no loss. All physical mediums may degrade through physical processes, but redundant systems can do better.
I think you misread the statement about the most complex system failing. I'm not saying that is the most complex system; I'm saying even the most complex system will fail.
It isn’t hard to seed a torrent. If a group of people want to preserve a file, they can do it this way, perfectly, forever, so long as there remain people willing to devote space and bandwidth.
LMAO at the idea of comparing every bit of every portion of every seeder's copy with each other simultaneously, then cross-referencing every parity file to be doubly safe, and then still failing to see the chance of loss of parity during transmission of said files. I will admit it would take a lot longer for a torrented file to degrade than some other forms of file distribution, but it's not going to last for a thousand years, mate.
And I am saying complexity has little to do with it and also that a system can exist that will not fail.
it’s not going to last for a thousand years
Specifically why not? What is unrealistic about this scenario, assuming enough people care to continue the preservation effort? All nodes must fail simultaneously for any data to be lost. The probability of any given node failing at any given time is a finite probability, and each failure is an independent event. The probability of N nodes failing simultaneously is P^N. That is exponential scaling: very quickly you reach astronomically low probabilities, so 1,000 years is nothing and could be safely accomplished with a relatively low number of peers. Maybe there are external factors that would make that less realistic, like whether new generations will even care about preserving the data, but considering only the system itself, it is entirely realistic.
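A quick back-of-the-envelope calculation; the 1% figure below is an assumed number purely for illustration:

```python
# Suppose each copy has a 1% chance of being lost in a given year,
# independently of the others (an assumption for illustration).
p = 0.01
for n in (2, 5, 10, 20):
    print(f"{n:>2} copies all lost in the same year: {p**n:.0e}")
# Output:
#  2 copies all lost in the same year: 1e-04
#  5 copies all lost in the same year: 1e-10
# 10 copies all lost in the same year: 1e-20
# 20 copies all lost in the same year: 1e-40
```

The caveat is correlated failure: everyone losing interest at once is far more likely than twenty independent disk deaths in the same year, which is the "external factors" point above.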
The best visual example I can think of is the experiments where YouTubers downloaded and re-uploaded their own video 100 times
This has nothing to do with copying a file. YouTube re-encodes videos whenever they are uploaded.
A file DOES NOT DEGRADE when it is copied. That is something that happened to VHS and cassette tapes. It does not happen to digital files. You can even verify this by generating a hash of a file, copying it 10,000 times, and generating a new hash; the two hashes will be 100% identical.
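That experiment takes a few lines to run; a sketch with placeholder file names, bouncing between two scratch files so 10,000 generations don't mean 10,000 files on disk:

```python
import hashlib
import shutil

def digest(path: str) -> str:
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

before = digest("original.bin")

# Make a copy of a copy of a copy: 5,000 round trips = 10,000 copy operations.
shutil.copyfile("original.bin", "gen_a.bin")
for _ in range(5_000):
    shutil.copyfile("gen_a.bin", "gen_b.bin")
    shutil.copyfile("gen_b.bin", "gen_a.bin")

assert digest("gen_a.bin") == before  # still bit-for-bit identical
```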
No, I won't be, because I've done this before for various reasons, and not a single bit was changed.
Let me put it this way. A computer stores programs and instructions it needs to run in files on a drive. These files contain exact and precise instructions for various components to operate. If even a SINGLE bit is off in just a couple of the OS files, your computer will start throwing constant errors if not just crashing entirely.
And this isn't just theory; it's provable. Cosmic rays have been known to hit a drive and cause a bit flip. Another issue is bit rot from a drive not being powered on for a long time.
At this point I'm starting to think you're a troll. There's no way someone believes what you're saying.
Then you're not a troll, just completely deluded and frankly stupid. You've been getting so many genuine responses trying to help you learn, but you keep digging in your heels and doubling down on being confidently wrong.
Believe whatever you want, just keep it to yourself.
They want to "help me learn" that a form of media storage invented and refined within a couple of decades will outlast all other forms, because they've deluded themselves that the things they rely on are perfect and that failure is impossible.
You're referring to a video codec degrading as the video is re-encoded over and over, not to copying the bits. There is no degradation from copying and pasting a file as-is.
And when you download the processed video and re-upload it, it's a 1:1 conversion through the same video codec, and every generation it gets worse. That example is low-hanging fruit, but the concept applies to everything.
That 1:1 conversion through the same codec is very likely lossy.
However, that's not a straight file copy, which is what you originally said causes degradation.
Literally every file distribution method compresses the media first. A better argument would be that YouTube re-encodes the video on re-upload with a particularly lossy method to save on bandwidth and server space.
If you take most digital media formats and compress them with something like 7z or WinRAR, then it might take years or decades to noticeably degrade, but it is still a matter of when, not if.
Holy crap. File compression is not the same thing as lossy media compression.
File compression uses deterministic mathematical algorithms with exactly defined outcomes. Meaning it doesn't matter how many times you compress and decompress a file; it will always come back exactly the same.
5 × 2 will always give you 10, and 10 ÷ 2 will always give you 5.
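The same point in a few lines of Python, using zlib as a stand-in for 7z or WinRAR:

```python
import zlib

# Lossless compression is deterministic and fully reversible.
data = b"the same bytes, over and over " * 10_000

# Compress and decompress a thousand times in a row.
for _ in range(1_000):
    data = zlib.decompress(zlib.compress(data))

assert data == b"the same bytes, over and over " * 10_000  # exact round trip
```

No matter how many round trips, the output is bit-for-bit identical to the input. Lossy media codecs, like the ones video sites use, make no such guarantee, and that is the entire distinction.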