I can't remember if ARQ does deduplication of files ("single instance" style) or by variable segment, in which case the index could become huge. Regardless, I'm wondering what some of the real-world data points around backup performance are with ARQ 7.6.

On my slow NAS, I'm backing up roughly 14.1TB, and the level zero so far has taken (4x24)+20 hours to scan 6.5TB and upload 4.6TB. In one crude dimension, that's about 11 MB/second. Dedup can be costly and doesn't apply to many home use cases (and use cases in general), but it's a huge deal for my workloads. So, any other performance data points, or perhaps a different way to frame performance measurements?

This is a new Synology NAS - the slim one with 6 2.5-inch drives - so it's slow, but even during the verify in background init (which will take 5-7 days), it can still get the expected 100MB/s in synthetic performance microbenchmarks.

```
08:21:51 PDT Total scanned: 7,326.083 GB ( 735,922 files)
22:25:18 PDT Total scanned: 6,786.
```

---

Rclone simply copies data. If you `sync` `~/Documents` to your remote, it will keep an exact copy. This is a simple backup, since you only have one version. Anything deleted, the next time it syncs, gets deleted.

Borg is a backup tool. It does that efficiently by deduplicating files (chunks, really) even if they're not in the same location. So with Borg, if you create a backup 1 of `~/Documents` today and a backup 2 of `~/Documents` tomorrow, you can see both backups and work with each snapshot. The size the second backup takes should be close to the amount of data changed in the whole source. If you move directories or files inside, rclone has to reupload them; Borg detects the move but doesn't have to store the data again.

With rclone, some remotes have versioning (Google Drive, Dropbox). This could help in this case, but it depends on the remote. With Borg this is built in, and you can change the underlying storage and migrate without losing any data. Using versioning with crypt would probably be a pain too, due to the file names. Not sure if rclone has commands/flags to help with this that I simply don't know about.

---

Well, you just have to "trust" the server to not serve a website that will phone home your password. Apple pinky swears they won't do that, and all your browser extensions running all the time do, too.

As Mark Zuckerberg once opined: "They 'trust' me. Those dumbfucks."

And it's not just mere words; here is how he actually set up a honeypot site to get people's passwords and break into their emails to satisfy his burning curiosity when he first launched Facebook:

![]()

Instead, he decided to access the email accounts of Crimson editors and review their emails. How did he do this? Here's how Mark described his hack to a friend:

![]()

Mark used his site, TheFacebook, to look up members of the site who identified themselves as members of the Crimson. Then he examined a log of failed logins to see if any of the Crimson members had ever entered an incorrect password into TheFacebook. In the cases in which they had entered failed logins, Mark tried to use them to access the Crimson members' Harvard email accounts. In other words, Mark appears to have used private login data from TheFacebook to hack into the separate email accounts of some TheFacebook users.

In a world where we are sending our Alexa data to "the cloud" and the companies are admitting people are listening to it, in a world where Facebook secretly records everything it can, why would you assume your password isn't being sent?

You may say that this was only when Mark Z was a young man and now Facebook the company is far more responsible. But then we have this from just a few months ago:

![]()

I can understand trusting a browser or a software release that can be tested by many people and was signed with a checksum. But a website, and all your extensions on every website? Those can ship new code at any time. But just because it's wet outside does not mean it has rained.
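The rclone-versus-Borg behavior described above can be sketched with a few commands. This is a minimal sketch, not a tested recipe: `remote:` stands for whatever rclone remote you have configured, and `/path/to/repo` and the archive names are placeholder examples.

```shell
# rclone `sync` keeps the destination as an exact mirror of the source,
# so files deleted locally are deleted on the remote at the next sync.
rclone sync ~/Documents remote:Documents

# Borg keeps every archive you create; identical chunks are stored once,
# so a second archive of mostly-unchanged data costs little extra space.
borg init --encryption=repokey /path/to/repo        # one-time repository setup
borg create /path/to/repo::docs-day1 ~/Documents    # "backup 1"
borg create /path/to/repo::docs-day2 ~/Documents    # "backup 2", mostly deduplicated
borg list /path/to/repo                             # both snapshots stay visible
```

If you want rclone to copy new and changed files without propagating deletions, `rclone copy` is the non-destructive variant of `sync`.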
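The "about 11 MB/second" figure quoted above is easy to re-derive from the other numbers in the post ((4x24)+20 hours of wall time to upload 4.6TB). A quick sanity check, assuming decimal units (1 TB = 1,000,000 MB):

```shell
hours=$((4 * 24 + 20))        # 116 hours of wall time so far
seconds=$((hours * 3600))     # 417,600 seconds
# 4.6 TB uploaded = 4,600,000 MB in decimal units
awk -v mb=4600000 -v s="$seconds" 'BEGIN { printf "%.1f MB/s\n", mb / s }'
# prints: 11.0 MB/s
```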