Em Adespoton

  • 0 Posts
  • 2.35K Comments
Joined 3 years ago
Cake day: June 4th, 2023


  • The idea is to verify the archival copy’s URL, not to verify the original content. So yes, a server could push different content to the archiver than to people, or vary by region, or an AitM could modify the content as it goes out to the archiver. But adding the sha256 in the URL query parameter means that if someone publishes a link to an archive copy online, anyone else using the link can know they’re looking at the same content the other person was referencing.

    If the archive content changes, the hash in the URL will no longer match; and an attacker can’t forge content to fit a given hash, because SHA-256 is collision-resistant (which is why MD5 wouldn’t be appropriate).

    The beauty of this technique is that query parameters are generally ignored if unsupported by the web server, so any archival service could start using this technique today, and all it would require is a browser extension to validate the parameter.

    Link it to something like Web of Trust, and you’ve solved the separate issue you described.

    In fact, this is a feature WoT could add to their extension today, and it would “Just Work”. For that matter, Archive.org could add it to their extension today, too.

    Remind me to ping Jason about that.
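    The check an extension would perform is simple. Here’s a minimal sketch in Python — the `sha256` parameter name and the archive host are assumptions for illustration, not an existing standard:

    ```python
    # Sketch of the validation a browser extension could perform:
    # compare the sha256 query parameter against the fetched page bytes.
    import hashlib
    from urllib.parse import urlparse, parse_qs

    def validate_archive_url(url: str, page_bytes: bytes) -> bool:
        """Return True iff the URL's sha256 parameter matches the content."""
        params = parse_qs(urlparse(url).query)
        claimed = params.get("sha256", [None])[0]
        if claimed is None:
            return False  # nothing to verify against
        actual = hashlib.sha256(page_bytes).hexdigest()
        return claimed == actual

    content = b"<html>archived snapshot</html>"
    digest = hashlib.sha256(content).hexdigest()
    url = f"https://archive.example/snapshot?sha256={digest}"

    print(validate_archive_url(url, content))                   # True
    print(validate_archive_url(url, b"<html>tampered</html>"))  # False
    ```

    Since servers ignore the unknown parameter, the same URL still resolves normally for anyone without the extension.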





  • Of course it’s a broad generalization; my point was that the tent cities aren’t predominantly made up of people who hit the kill line, that the situation isn’t that simple.

    Fixing the kill line won’t affect tent cities that much; there are further societal issues that are also at play.

    In China, substance abuse is handled differently, as are mental health issues… so you don’t end up with those people in tent cities either.

    This doesn’t mean those people don’t exist in a state of suffering; it just means they’re not publicly visible.



  • He only modified archived pages in response to a dox attempt?

    And the thing is, the discovery of the modified pages revealed that this wasn’t even the first time he’d done it. And he used a real person’s identity to try to shift blame.

    Irrespective of the doxxing allegations, if he’s done all this multiple times already, it means the page archives can’t be trusted AND there’s no guarantee that anything archived with the service will be available tomorrow.

    Seems like we need to switch to URLs that contain the SHA256 of the page they’re linking to, so we can tell if anything has changed since the link was created.










    It uses a completely different paradigm of process chaining and management from the one POSIX and the underlying Unix architecture assume.

    That’s not to say it’s bad, just a different design. It’s actually very similar to what Apple did with OS X.

    On the plus side, it’s much easier to understand from a security model perspective, but it breaks some of the underlying assumptions about how scheduling and running processes works on Linux.

    So: more elegant in itself, but an ugly wart on the overall systems architecture design.