OneMeaningManyNames

Full time smug prick

  • 22 Posts
  • 205 Comments
Joined 7 months ago
Cake day: July 2nd, 2024

  • Lavabit

    Connection to Edward Snowden

    Lavabit received media attention in July 2013 when it was revealed that Edward Snowden was using the Lavabit email address [email protected] to invite human rights lawyers and activists to a press conference during his confinement at Sheremetyevo International Airport in Moscow.[16] The day after Snowden revealed his identity, the United States federal government served a court order, dated June 10, 2013, and issued under 18 USC 2703(d), a 1994 amendment of the Stored Communications Act, asking for metadata on a customer who was unnamed. Kevin Poulsen of Wired wrote that “the timing and circumstances suggest” that Snowden was this customer.[17] In July 2013 the federal government obtained a search warrant demanding that Lavabit give away the private SSL keys to its service, affecting all Lavabit users.[18] A 2016 redaction error confirmed that Edward Snowden was the target.[2]

    source

    But what is the status now? Also, I think jurisdiction will play a growing role in the years to come. If the service is on the soil of a country that can subpoena the encryption keys, then nobody is really safe.
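
    The key-subpoena risk is exactly what forward secrecy addresses. Here is a toy sketch (insecure parameters, purely illustrative) of ephemeral Diffie-Hellman, the property modern TLS relies on: session secrets derive only from per-session keys that are discarded, so surrendering a server's long-term key does not unlock recorded traffic. The Lavabit-era static-RSA key exchange lacked this property, which is why handing over one SSL private key affected every user.

```python
import secrets

# Toy finite-field Diffie-Hellman (illustration only; real TLS uses
# vetted groups or elliptic curves, never parameters like these).
P = 2**127 - 1  # a Mersenne prime; fine for a sketch, never for production
G = 3

def ephemeral_keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

# Each session generates fresh keys and discards them afterwards.
a_priv, a_pub = ephemeral_keypair()
b_priv, b_pub = ephemeral_keypair()
shared_a = pow(b_pub, a_priv, P)
shared_b = pow(a_pub, b_priv, P)
assert shared_a == shared_b
# A subpoena for the server's long-term key cannot recover past session
# secrets here: they depend only on a_priv/b_priv, which were discarded.
```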

  • Fancier algorithms are not bad per se. They can be ultra-productive for many purposes. In fact, we take no issue with fancy algorithms when they are published as software libraries. But then only specially trained folks can reap their fruits, and those folks happen to be the ones working for Big Tech. Now, if we had user interfaces that let the user control several free parameters of the algorithms and experience different feeds, that would be kinda nice. The problem boils down to these areas:

    • near-universal social graphs (they have all the people enlisted)
    • total control over the algorithm parameters
    • inference of personal and sensitive data points (user modeling)
    • no informed consent on the part of the user
    • total behavioral surveillance (they collect every click)
    • manipulation of the feed while observing all behavioral responses (essentially human-subject research for ads)
    • profiteering from the above while harming the user’s well-being (unethical)

    Political interference and the proliferation of fascist “ideas” become possible if and only if all of the above are in play. If you take all this destructive shit away, software that lets you explore vast amounts of data with cool algorithms through a user-friendly interface would not be bad in itself.
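
    To make the “user-controlled free parameters” idea concrete, here is a minimal sketch of a feed ranker whose weights are exposed to the user instead of hidden (a hypothetical design, not any real platform’s API): dial the parameters and you experience a different feed.

```python
from dataclasses import dataclass

# Hypothetical user-tunable feed ranker: the "free parameters" are
# visible and adjustable, not locked away by the platform.
@dataclass
class FeedParams:
    recency: float = 1.0      # weight for newer posts
    engagement: float = 0.0   # weight for highly-reacted posts
    locality: float = 0.0     # weight for followed accounts

def score(post, p: FeedParams):
    return (p.recency * post["freshness"]
            + p.engagement * post["reactions"]
            + p.locality * post["followed"])

posts = [
    {"id": 1, "freshness": 0.9, "reactions": 0.1, "followed": 1},
    {"id": 2, "freshness": 0.2, "reactions": 0.9, "followed": 0},
]

chronological = FeedParams()                      # plain timeline
viral = FeedParams(recency=0.1, engagement=1.0)  # engagement-driven
for params in (chronological, viral):
    feed = sorted(posts, key=lambda x: score(x, params), reverse=True)
    print([x["id"] for x in feed])  # order changes with the dials
```

    The same data, two different feeds, and the user decides which one they see — that is the difference between an algorithm as a tool and an algorithm as an instrument of control.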

    But you see, that is why we say “the medium is the message” and that “television is not a neutral technology”. As a media system, television is constructed so that a few corporations can address the masses, not the other way around, and not so that people can interact with their neighbors. For a brief moment in time, the internet promised to subvert that, until centralized social media brought back the exertion of control over the messaging by a few corporations. The current alternative is the Fediverse and P2P networks. This is my analysis.


  • If you model and infer some aspect of the user that is considered personal (e.g., de-anonymization) or sensitive (e.g., inferring sexuality) by means of an inference system, then you are within the scope of the GDPR. Further use of these inferred data down the pipeline can be construed as unethical. If they want to be transparent about it, they have to open-source their user-modeling and decision-making system.
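
    De-anonymization often needs no fancy model at all. A toy sketch of the classic linkage attack (fabricated example data): records with names stripped can still be re-identified by joining on quasi-identifiers, which is precisely the kind of inference that pulls a system into the GDPR’s scope.

```python
# Toy linkage attack on fabricated data: "anonymized" records joined
# with a public record via quasi-identifiers (zip, birth date, sex).
medical = [  # names removed, supposedly anonymous
    {"zip": "02138", "dob": "1945-07-31", "sex": "F", "diagnosis": "X"},
]
voter_roll = [  # public record that includes names
    {"name": "J. Doe", "zip": "02138", "dob": "1945-07-31", "sex": "F"},
]

def reidentify(anon, public):
    keys = ("zip", "dob", "sex")
    index = {tuple(p[k] for k in keys): p["name"] for p in public}
    return [(index.get(tuple(r[k] for k in keys)), r["diagnosis"])
            for r in anon]

print(reidentify(medical, voter_roll))  # [('J. Doe', 'X')]
```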

  • You think the Meta algorithm just sorts the feed for you? It is way more complex: it basically puts you into some very fine-grained clusters, then decides what to show you, then collects your clicks and reactions and adjusts itself. For perspective, no academic “research with human subjects” would be approved with mechanics like that under the hood. It is deeply unethical and invasive, outright dangerous for individuals (e.g., teen self-esteem issues, eating disorders, and so on). So “algorithm-like features” is an apples-to-oranges comparison here.
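
    The cluster-serve-observe-adjust loop can be sketched with a toy epsilon-greedy bandit (hypothetical, in no way Meta’s actual system): assign the user a coarse cluster, serve content, record clicks, and let the policy drift toward whatever maximizes engagement — including content that harms the user.

```python
import random

random.seed(0)  # deterministic toy run

EPSILON = 0.1
ARMS = ["outrage", "cute_animals", "ads"]
stats = {}  # (cluster, arm) -> (clicks, impressions)

def ctr(cluster, arm):
    clicks, shows = stats.get((cluster, arm), (0, 0))
    return clicks / shows if shows else 0.0

def choose(cluster):
    if random.random() < EPSILON:      # explore occasionally
        return random.choice(ARMS)
    return max(ARMS, key=lambda a: ctr(cluster, a))  # else exploit

def record(cluster, arm, clicked):
    clicks, shows = stats.get((cluster, arm), (0, 0))
    stats[(cluster, arm)] = (clicks + clicked, shows + 1)

# Simulated cluster that clicks "outrage" most; the loop learns to push it.
CLICK_RATE = {"outrage": 0.3, "cute_animals": 0.2, "ads": 0.05}
for _ in range(5000):
    arm = choose("teen")
    record("teen", arm, random.random() < CLICK_RATE[arm])

print(max(ARMS, key=lambda a: ctr("teen", a)))  # the arm the loop favors
```

    Even this ten-line toy is an uncontrolled behavioral experiment on its “users” — which is the point: the real systems do this at billion-user scale with no ethics board in sight.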

  • OneMeaningManyNames@lemmy.ml to Memes@lemmy.ml · How the turntables...
    ↑ 14 · ↓ 3 · 6 days ago

    The premise of this meme is overly simplistic, effectively equating a social media platform with a website hosting specific beliefs.

    Here are, off the top of my head, some ways Big Social is different, regardless of country.

    • Hosting a platform with millions or billions of users.
    • Exploiting algorithms that mine sensitive data to an invasive degree.
    • Controlling the flow of information with a very granular degree of precision.
    • Experimentally collecting behavioral data in response to said control of information.
    • Modeling users’ life expectancy, sexual orientation, political beliefs, consumer patterns, and terminal illnesses.
    • Selling said data and model outputs to private insurance companies as well as police states.
    • Addicting users so that they withdraw from real life and stay hooked to their screens, where they can happily serve the company as data-mining subjects.

    I hardly think that any of the above should be gauged by the standards of individual rights to free speech, even if corporate entities are viewed as individuals with a right to free speech.

    This is something else entirely, and whoever owns it, out of whichever country, must have their ass regulated off.

    Even harder than the EU did.

    Operations of this type and size should be eventually dismantled. They are inherently antisocial, corporatist, and totalitarian in their conception and daily function.

    Some time ago I started a discussion about the “Role of Attrition” in the effort to dismantle Big Social enterprises: here it is.