I don't blame you re the third party - I wouldn't either. I generally download a transaction file periodically and import it locally using the app. I think you're going to find it difficult to find an API that will allow little people access, even though they are obviously happy to offer that to the big companies. Some of the brokerages have checking accounts and it might be possible to pull the transaction data via the broker's API (maybe), but whichever way you look at it, I suspect the most pragmatic solution is probably going to be a download/import of some kind.
what are you looking to do? I don't know of any consumer bank APIs, but most equity and exchange brokerages will let you check account balances and make trades with an API key and credentials. Probably not initiate payments or transfers though - there are too many security risks involved in allowing that via a consumer-level API. There are also tools like Mint that store your credentials and can presumably access your data because they have corporate-level agreements with the financial institutions - I haven't used that and wouldn't normally recommend a corporate-based solution like that personally, but it might work for your needs.
Yes, you understand the suggested approach. I don't know the mariadb tool - if it looks good, by all means use it - but I would offer that the fastest, simplest way to restore a reasonably small database that I can think of is with a sql dump. Any additional complexity just seems like it's adding potential failure points. You don't want to be messing around with borg or any other tools to replay transactions when all you want to do is get your database rebuilt. Also, if you have an encrypted local copy of the dump, then restoring from borg is the last resort, because most of the time you'll just need the latest backup. I would bring the data local and back it up there if feasible. Then you only need a remote connection to grab the encrypted file, and you'll always have a recent local copy if your server goes kaput. Borg will back it up incrementally.
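If it helps to see how little is involved, the restore is basically a one-liner - a minimal sketch, assuming a gpg-encrypted mysqldump (the file and database names are just placeholders):

```
# decrypt the dump and pipe it straight back into the server
gpg --decrypt mydb.sql.gpg | mysql -u root -p mydb
```

(You may need to create the empty database first if the dump wasn't taken with --databases.)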
for the database, consider a script that does a "mysqldump" of the entire database, scheduled to run on the system daily/weekly. Also consider using gpg to encrypt the plain-text file and delete the original in the same script, so you don't leave a copy of the data unencrypted anywhere outside the database. You can then either copy the encrypted file to a local folder that you're backing up, or, if you've set this up to back up directly on the remote, that's fine too - bringing it local gives you a staged copy, outside the archive and not on the original host, in case you need an immediately available backup of your database.
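Something along these lines is what I mean - just a sketch, with the database name, gpg recipient and paths as placeholders you'd swap for your own:

```
#!/bin/sh
# nightly dump: dump -> encrypt -> remove plaintext -> stage for backup
# (credentials assumed to come from ~/.my.cnf or socket auth)
set -eu
STAMP=$(date +%F)
DUMP="/tmp/mydb-$STAMP.sql"

# dump the whole database (--single-transaction avoids locking InnoDB tables)
mysqldump --single-transaction mydb > "$DUMP"

# encrypt to your own key, then remove the plaintext so it never lingers on disk
gpg --encrypt --recipient you@example.org --output "$DUMP.gpg" "$DUMP"
rm "$DUMP"

# stage the encrypted copy where your regular backup (borg or otherwise) picks it up
mv "$DUMP.gpg" /srv/backups/
```

Cron or a systemd timer can run it on whatever schedule you like.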
With respect to the 3 separate repos, I would say keep them separate unless you have a large amount of duplicated data. Borg does not deduplicate across different repos as far as I'm aware. The downside of using a single repo is that the repo is locked during backups, and if you're running different scripts from each host, the lock files borg creates can become stale when a script doesn't complete. One day (probably the day you're trying to restore) you'll find that borg hasn't been backing your stuff up because a stale lock from a backup that was killed by an untimely reboot months ago is still holding the repo open. I don't recall now why this occurs and doesn't self-correct, but I do remember concluding that if deduplication isn't a major factor, it's easier and safer to keep the borg repos separate by host. Deduplication is the only reason to combine them as far as I can tell.
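For what it's worth, the per-host layout I mean is roughly this - the repo location and directories are placeholders, and passphrase handling is left out:

```
# each host points at its own repo on the remote
export BORG_REPO="ssh://backup@remote.example.org/./borg-$(hostname)"

# initialise once per host
borg init --encryption=repokey

# nightly archive, named by host and date using borg's own placeholders
borg create ::'{hostname}-{now:%Y-%m-%d}' /etc /home /srv/backups

# if a crashed run ever does leave a stale lock, you can clear it by hand:
# borg break-lock "$BORG_REPO"
```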
When it comes to backup scripts, try to keep everything foolproof and use checks where you can to make sure the script is seeing the expected data, completes successfully and so on. Setting up automatic backups isn't a trivial task, although maybe tools like rclone and borgmatic simplify it - I haven't used those, just borg command line and scp/gpg in shell scripts. Have fun!
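For example, a few cheap checks like these catch most of the silent failures (the variables are placeholders carrying on from a dump script like the one above):

```
# mysqldump ends successful output with a "-- Dump completed" comment;
# treat its absence as a failed dump
tail -n 3 "$DUMP" | grep -q "Dump completed" || { echo "dump looks truncated" >&2; exit 1; }

# refuse to ship an empty or suspiciously small file
[ "$(wc -c < "$DUMP.gpg")" -gt 1024 ] || { echo "encrypted dump is too small" >&2; exit 1; }

# and have borg verify the repository's consistency now and then
borg check "$BORG_REPO"
```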
you have the main problem in hand. You'll still need to do all the DKIM / rDNS stuff to be certain your mail is accepted, but using SES as the source gives you a significant leg up vs originating locally. I don't see why you can't run dovecot and postfix on separate systems, but a single VM isn't bad if it's properly secured. Hosting SMTP/IMAP is not that difficult, but you need to make sure you don't accidentally misconfigure things and become an open relay - as with all internet-facing systems, mail services are targeted constantly, so you should use fail2ban to deter the brute-force attempts.
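A rough sketch of the fail2ban side - this assumes your distro's package ships the stock postfix/dovecot filters (most do), and the ban settings are just examples:

```
# append the mail jails to the local override (create the file if it doesn't exist)
cat >> /etc/fail2ban/jail.local <<'EOF'
[postfix]
enabled = true
maxretry = 5
bantime = 3600

[postfix-sasl]
enabled = true

[dovecot]
enabled = true
EOF

systemctl restart fail2ban
fail2ban-client status    # confirm the jails actually came up
```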
this is just a quick script I came up with, but it will show you the newest communities and their descriptions. It refreshes daily. Maybe it will be helpful for discovering niche communities: https://lemmyfind.quex.cc/
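The gist of it is just polling the Lemmy API for the newest communities - roughly like this (the endpoint and field names are from memory, so treat them as assumptions and double-check against the API docs):

```
# list the 20 newest communities with their descriptions
curl -s "https://lemmy.ml/api/v3/community/list?sort=New&limit=20" \
  | jq -r '.communities[].community | "\(.name): \(.description // "no description")"'
```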
I don't know if it helps or not for me to point this out (I hope it's something that gives you some solace), but depending on the circumstances it's also very difficult to go through an investigation and trial. Maybe things are better now, but 20 or 30 years back it was an ordeal for the victim. The "what were you wearing?" mentality was very prevalent within the male-dominated judiciary and they made it so hard on the victims that they often felt like they were on trial - and in many cases they still didn't get justice either, despite their personal lives being dissected in front of a room full of strangers, some of whom were intent on falsely portraying them as promiscuous. After seeing this happen to a friend, I lost faith in the system to deliver justice. I don't have a solution, but an adversarial system just doesn't seem ideal for this kind of prosecution.
I didn't notice that you were posting from Mastodon as I'm on Lemmy and your posts appear here just like any other Lemmy user's - but the @fuser at the start of your messages is probably the tell. I think Mastodon defaults to including the username you're replying to, whereas Lemmy doesn't. It's great that we can use different applications without some corporate gatekeeper capturing everybody's personal info at the integration point to hawk to an advertising company.
Well, thanks again for the info - I'm trying it now and the results seem excellent. It took me to wikiwand, which I'd never used before - it's a front end for wikipedia and it's quite nice. I've learned so much about alternative FOSS and great ad-free content by reading and posting here. I was never a great fan of reddit - I liked to scroll but hardly ever posted there - I thought RPAN was the coolest thing they did - but Lemmy is great for conversation, despite the relatively small user base. I'm grateful that reddit's nonsense drove so many helpful people here.
The best part is watching these idiots blow their fortunes thinking they are going to continue building on the old paradigm of monolithic platforms when the ground is gradually shifting towards diversification via decentralization and they are behind the curve now, not in front of it. This is not your dad's internet. Hopefully they continue splashing out huge amounts of cash in ill-fated efforts to prove they are still relevant. There's no fool like an old fool - and old, rich technically-out-of-touch fools who lack the self-awareness to stop imagining they are hip are particularly amusing.
everyone has their price. with unlimited funds he should be able to make something pretty slick for whatever the vision is - not that I would ever use it, of course. But don't imagine that software engineers won't compromise their politics if the money is good. Given the necessity of income, you can rationalize anything if you really need to.