• 0 Posts
  • 18 Comments
Joined 1 year ago
Cake day: June 15th, 2023

  • … have practically relaunched the same press release that praises Mammoth, the Mastodon app developed by The BLDV Inc., the Californian start-up financed by Mozilla (and THEREFORE by Google, which now finances 90% of Mozilla)

    That's a pretty crazy statement. Mozilla is not a shell company for Google. They have very different goals, and Google pays Mozilla to make Google Search the default because otherwise Firefox users would have a different default search engine, giving another search engine a serious opportunity to get big.

    Mozilla funds many things, and this is just a well-made open source client for Mastodon. There is no secret big-corp agenda behind it, because that would be crazy. Why wouldn't Google develop such a thing in house and use its influence to bring users to it, and why use Mozilla, the rival browser company, to do it? This is a major flaw in the article, serious enough that I can't take it seriously.




  • Just chiming in here to say that this is very much like security through obscurity. In this context the “secure” part is being sure that the images you host are OK.

    Bad actors using social engineering to get the banlist is much easier than using open source AI and collectively fixing the bugs when the trolls manage to evade it. It's not that easy to get around image filters like this, and having to do weird things to the pictures to get past the filter could be enough work to keep most trolls from posting.

    Using a central instance that filters all images is also not good, because the person operating that service becomes responsible for a large chunk of your images, creating a single point of failure in the fediverse (and something that could be monetised to our detriment). Closed source can't be the answer either, because if someone breaks the filter, the community can't fix it; only the developer can. So either the dev team is online 24/7 or it's paid, making hosting a fediverse server dependent on someone's closed source product.

    I do think, however, that disabling image federation should be an option. Turning image federation off for a server for a limited time could be a very effective tool against these kinds of attacks!
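    To make the point about open image filters concrete, here is a minimal sketch of a perceptual "average hash" filter, the style of openly documented technique the comment alludes to. Everything here is hypothetical and simplified (real servers would decode actual images and typically use a library such as a pHash implementation); it only illustrates why small tweaks to a picture don't evade a near-match banlist check.

    ```python
    # Hypothetical sketch: average-hash filtering against a banlist.
    # "pixels" stands in for a decoded, downscaled 8x8 grayscale image.

    def average_hash(pixels):
        """pixels: 8x8 grid of grayscale values (0-255). Returns a 64-bit int."""
        flat = [p for row in pixels for p in row]
        avg = sum(flat) / len(flat)
        bits = 0
        for p in flat:
            # One bit per pixel: is it brighter than the average?
            bits = (bits << 1) | (1 if p > avg else 0)
        return bits

    def hamming(a, b):
        """Number of differing bits between two hashes."""
        return bin(a ^ b).count("1")

    def is_banned(pixels, banlist, threshold=10):
        """Flag an image whose hash is within `threshold` bits of a banned hash."""
        h = average_hash(pixels)
        return any(hamming(h, banned) <= threshold for banned in banlist)
    ```

    Because the hash reflects coarse brightness structure rather than exact bytes, minor edits flip few bits and the near-match still fires; a troll has to distort the picture substantially, which is exactly the "weird things to the pictures" barrier described above.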


  • Well, that's how it generally worked as far as I know. I'm not saying that you can host illegal stuff as long as no one reports it. I'm saying it's impossible to know instantly if someone is posting something illegal to your server; you'd have to see it first. Otherwise pretty much the entire internet would be illegal, because any user can upload something illegal at any time, and you'd instantly be guilty of hosting illegal content? I doubt it.



  • Ah sorry, I didn't know that there is an attack going on currently; I just saw a bunch of posts about Lemmy being illegal to operate because of the risk of CP federation. And then this post, which seemed to imply that one needs constant automated illegal-content filtering, which as far as I know isn't required by law unless you operate a major service that is reachable in the EU, and fediverse servers aren't major enough for that.