• AmbiguousProps · +170 / −2 · 8 months ago

    As much as I hate them, this is likely because a customer misconfigured their bucket, not anything on Amazon’s end.

    • cybersandwich@lemmy.world · +18 · 8 months ago

      I have never configured S3 buckets for an enterprise personally, but I have used AWS for some personal projects. The console pretty clearly warns you if you try to open a bucket to the public: “This is unsafe. Everyone can see everything, you idiot!”

      They must be doing it through the CLI.
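A sketch of what that could look like (bucket name is hypothetical; this requires AWS credentials and assumes account-level Block Public Access is disabled):

```shell
# Unlike the web console, the CLI applies a public-read ACL
# with no warning or confirmation prompt.
aws s3api put-bucket-acl --bucket example-bucket --acl public-read
```

The same applies to anything scripted against the API: the “everyone can see everything” warning only exists in the web UI.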

    • pop@lemmy.ml · +4 / −16 · edited · 8 months ago

      There’s no reason for amazonaws.com to be on search engines at all. Keeping it off is as simple as placing a robots.txt with a deny-all declaration. Then no user would have to worry about shit like this.
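For reference, the deny-all file being suggested here is just two lines, served at the site root:

```
User-agent: *
Disallow: /
```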

      • Moonrise2473@feddit.it · +13 · 8 months ago

        Who said that?

        Many other customers actually want their files indexed: maybe they are hosting images for their website on S3, or other public files that are meant to be easily found.

        If the file isn’t meant to be public, then it’s the fault of the webmaster who placed it in a public bucket or linked it somewhere on a public page.

        Also: hosting files on Amazon S3 is super expensive compared to normal hosting; only public files that get lots of downloads should use it. A document labeled “internal use only” should reside on a normal server, where you don’t need the high speed or high availability of AWS, and where you can place some kind of web application firewall that restricts access from outside the company/government.

        For comparison, it’s like taking a $5 toll road for just a quarter of a mile at 2 a.m. There’s no traffic and you’re not in a hurry; you can go local and save that $5.

      • AmbiguousProps · +7 · 8 months ago

        robots.txt doesn’t have to be followed; it’s a voluntary convention, not an access control, so it doesn’t actually block crawling.
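Python’s standard library even ships a parser for these rules, which makes the voluntary nature easy to see: a polite crawler asks first, an impolite one simply never asks. A minimal sketch (the bucket URL is made up):

```python
from urllib import robotparser

# The deny-all robots.txt proposed upthread
rules = [
    "User-agent: *",
    "Disallow: /",
]

parser = robotparser.RobotFileParser()
parser.parse(rules)

# A well-behaved crawler checks before fetching and backs off...
allowed = parser.can_fetch("AnyBot", "https://example-bucket.s3.amazonaws.com/secret.pdf")
print(allowed)  # False

# ...but nothing enforces this: a scraper that never consults
# robots.txt can fetch the same URL anyway.
```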