• 0 Posts
  • 67 Comments
Joined 2 years ago
Cake day: June 22nd, 2023


  • Even more specifically, if we are talking about a temporal teleport, then this shouldn’t be a surprise. Most mainstream fiction uses teleports for time travel: you pop out of one time and into another without experiencing the time in between. That’s as opposed to the device Farnsworth made in The Late Philip J. Fry, where they just change their speed through time instead of skipping over it; in that case, you shouldn’t have to worry about this issue at all. But due to causality and the nature of spacetime, any teleportation device is simultaneously a temporal and spatial teleport. So any teleport would need spacetime coordinates, not just spatial or temporal ones.
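    To illustrate the point about coordinates (a hypothetical sketch, not from any real teleporter spec): a complete destination in spacetime is an event, one time component plus three spatial ones. Leaving any of them out underdetermines the jump.

    ```python
    from dataclasses import dataclass

    # Illustrative only: a teleport destination specified as a spacetime event.
    # Purely spatial coordinates (x, y, z) are not enough -- every arrival
    # still happens at some time t, so t must be part of the coordinate.
    @dataclass(frozen=True)
    class SpacetimeCoordinate:
        t: float  # time coordinate, e.g. seconds in some chosen reference frame
        x: float  # spatial coordinates, e.g. meters
        y: float
        z: float

    # A "purely spatial" jump still implies a time of arrival; omitting t
    # would leave the destination event undefined.
    arrival = SpacetimeCoordinate(t=0.0, x=1.0, y=2.0, z=3.0)
    ```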


  • It wasn’t as unrelated as it might appear. First, they used their D+ account to make their Disney account. Second, the whole point of that argument was that the relevant EULA, the Disney account one, contains an arbitration clause. They only brought up the D+ account in passing because it has the same clause, emphasizing that the plaintiff had to read and agree to the clause twice; if they didn’t catch it, it’s not Disney’s fault that they lied about having read it. They basically said, “Look, this is an issue regarding the Disney account, and they said right here they read and understood the terms, which include arbitration. And here, they read and agreed to the exact same terms a few months earlier on D+. This shouldn’t be any surprise if they were truthful when they claimed to have read it.”

    Disclaimer: arbitration clauses are bullshit and need to be reworked or eliminated, as they are generally very anti-consumer, and I don’t think it’s good that Disney has that clause. But accepting that it exists, Disney didn’t really do anything particularly scummy.




  • Never mention it. They will often ask questions about how you think a juror should or can act. If you answer in a way that shows you might know about nullification, you are out. If you later admit you know about it, they will point to those questions and know you lied. The safest answer is to never, ever use the term; ideally, go through the motions in deliberation of putting the rules together yourself, as if you are only realizing it’s a possibility then and there.











  • AEsheron@lemmy.world to Lemmy Shitpost@lemmy.world · Socialism · 4 months ago
    Sapient, not sentient. Sci-fi has co-opted the word, but sentient basically means able to feel and perceive; there are plenty of sentient species right here at home. Sapient is the word sci-fi usually wants, and there are no known sapient species aside from humans. Some argue that a couple of other animals may qualify, but it’s a very fuzzy concept that is hard to pin down in a being unable to communicate abstract ideas.



  • If it slowed down, it would get closer, not farther away. The truth is, any orbit is only stable over a given timeframe; the longer that timeframe, the less likely any given orbit is to survive. Tidal interaction keeps transferring a little of Earth’s rotational angular momentum to the Moon, so it has been slowly receding for its entire history, though it is still well below escape speed at its distance.
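    A back-of-the-envelope check on this, using the standard two-body formulas and published values for Earth’s gravitational parameter and the Moon’s mean distance (a sketch, not a precise ephemeris):

    ```python
    import math

    # Well-known reference values (not from the comment above):
    # Earth's standard gravitational parameter GM (m^3/s^2) and the
    # Moon's mean orbital radius (m).
    MU_EARTH = 3.986e14
    R_MOON = 3.844e8

    # Circular orbital speed at the Moon's distance: v = sqrt(GM / r)
    v_circular = math.sqrt(MU_EARTH / R_MOON)      # ≈ 1.02 km/s

    # Escape speed at the same distance: v = sqrt(2 GM / r) = sqrt(2) * v_circular
    v_escape = math.sqrt(2 * MU_EARTH / R_MOON)    # ≈ 1.44 km/s

    # The Moon's actual mean orbital speed (~1.02 km/s) sits near the circular
    # value and well under escape speed, so it is gravitationally bound; the
    # recession comes from tidal torque, not from being on an escape trajectory.
    ```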


  • The Hippocratic oath, in this case. Medicine is all about risk management: the worse the “disease,” the more tolerant we are of side effects from the cure. Pregnancy and birth are still pretty traumatic events that, while much safer than they used to be, remain dangerous; female BC just has to be less risky than that. Male BC, on the other hand, has to be less risky than the risk a man faces from impregnating a woman, which is to say, almost zero. Pretty much any negative side effect is worse than that, so it’s very difficult for one to pass. I would gladly take one with side effects comparable to female BC, but sometimes unflinching ethics are inconvenient. Better than the alternative, but still.


  • I agree, but it isn’t so clear cut. Where is the cutoff on required complexity? As it stands, both our brains and the most complex AI are pretty much black boxes. It’s impossible to say that this system we know vanishingly little about is or isn’t fundamentally the same as that system we know vanishingly little about, just at a different scale. The first AGI will likely still have most people saying the same things about it: “it isn’t complex enough to approach a human brain.” But it doesn’t need to equal a brain to still be intelligent.