• 0 Posts
  • 37 Comments
Joined 1 year ago
Cake day: March 3rd, 2024

  • The open availability of cutting-edge models creates a multiplier effect, enabling startups, researchers, and developers to build upon sophisticated AI technology without massive capital expenditure. This has accelerated China’s AI capabilities at a pace that has shocked Western observers.

    Didn’t a Google engineer put out a white paper about this around the time Facebook’s original LLM weights leaked? It compared the rate of development of corporate AI groups with that of the open source community and found there was no possible way the corporate model could keep up if there were even a small investment in the open development model. The open source community was solving in weeks open problems the big companies couldn’t solve in years. I guess China was paying attention.


  • It’s not disingenuous. There are multiple definitions of “offline” being used here, and just because some people aren’t using yours doesn’t mean they’re ignorant or arguing in bad faith.

    Your definition of “offline” encompasses just the executable code. So under that definition, sure, it’s offline. But I wouldn’t call an application “offline” if it requires an internet connection for any core feature, and I’d call saving my document a core feature of a word processor. Since I wouldn’t call it “offline,” I’m not sure what I would call it, but something closer to “local” or “native” to distinguish it from a cloud-based application with a browser or other frontend.


  • Ah, I think I misread your statement of “followers by nature” as “followers of nature.” I’m not really willing to ascribe personality traits like “follower” or “leader” or “independent” or “critical thinker” to humanity as a whole based on the discussion I’ve laid out here. Again, the possibility space of cognition is bounded, but unimaginably large. What we can think may be limited to a reflection of nature, but the possible permutations that can be made of that reflection are more than we could explore in the lifetime of the universe. I wouldn’t really use this as justification for or against any particular moral framework.


  • I think that’s overly reductionist, but ultimately yes. The human brain is amazingly complex, and evolution isn’t directed but keeps going with whatever works well enough, so there’s going to be incredible breadth in human experience and cognition across everyone in the world and throughout history. You’ll never get two people thinking exactly the same way because of the sheer size of that possibility space, despite the more than 100 billion people who have lived throughout history and today.

    That being said, “what works” does set constraints on what is possible with the brain, and evolution went with the brain because it solves a bunch of practical problems that enhanced the survivability of the creatures that possessed it. So there are bounds to cognition, and there are common patterns and structures that shape cognition because of the aforementioned problems they solved.

    Thoughts that initially reflect reality but that can be expanded in unrealistic ways to explore the space of possibilities an individual can effect in the world around them have clear survival benefits. Thoughts that spring from nothing and that relate in no way to anything real strike me as not useful at best and at worst disruptive to what the brain is otherwise doing. Thinking on that perspective more, given the powerful levels of pattern recognition in the brain, I wonder if the creation of “100% original thoughts” would result in something like schizophrenia, where the brain’s pattern recognition systems are reinterpreting (and misinterpreting) internal signals as sensory signals of external stimuli.


  • The problem with that reasoning is it’s assuming a clear boundary to what a “thought” is. Just like there wasn’t a “first” human (because genetics are constantly changing), there wasn’t a “first” thought.

    Ancient animals had nervous systems that could not come close to producing anything we would consider a thought, and through gradual, incremental changes we get to humanity, which is capable of thought. Where do you draw the line? Any specific moment in that evolution would be arbitrary, so we have to accept a continuum of neurological phenomena that span from “not thoughts” to “thoughts.” And again we get back to thoughts being reflections of a shared environment, so they build on a shared context, and none are original.

    If you do want to draw an arbitrary line at what a thought is, then that first thought was an evolution of non-/proto-thought neurological phenomena, and itself wasn’t 100% “original” under the definition you’re using here.


  • From your responses to others’ comments, you’re looking for a “thought” that has absolutely zero relationship with any existing concepts or ideas. If there is overlap with anything that anyone has ever written about or expressed in any way before, then it’s not “100% original,” and so either it’s impossible or it’s useless.

    I would argue it’s impossible because the very way human cognition is structured is based on prediction, pattern recognition, and error correction. The various layers of processing in the brain are built around modeling the world around us in a way that generates predictions, then higher layers compare those predictions with the actual sensory input to identify mismatches, and the layers above that reconcile the mismatches and adjust the prediction layers. That’s a long-winded way to say our thoughts are inspired by the world around us, and so are a reflection of the world around us. We all share our part of this world with at least one other person, so we’re all going to share commonalities in our thoughts with others.
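
    To make that loop concrete, here’s a toy sketch of the predict/compare/adjust cycle described above. It’s purely illustrative: the single scalar “belief,” the learning rate, and the function name are my own stand-ins, not a claim about actual neural wiring.

    ```python
    import random

    # Toy predict -> compare -> adjust loop (illustrative only).
    # A single internal "belief" tracks a noisy external signal by predicting it,
    # measuring the prediction error, and nudging itself toward the observation.
    def track_signal(true_value=5.0, steps=100, learning_rate=0.2, noise=0.5):
        estimate = 0.0                                         # current internal belief
        for _ in range(steps):
            prediction = estimate                              # generate a prediction
            observation = true_value + random.gauss(0, noise)  # sensory input from "the world"
            error = observation - prediction                   # compare prediction with input
            estimate += learning_rate * error                  # adjust the predictive model
        return estimate

    print(track_signal())  # lands near 5.0: the belief ends up reflecting its input
    ```

    The point of the toy is that the internal state only ever comes to reflect what the input feeds it; nothing in the loop manufactures content untethered from the signal.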

    But for the sake of argument, assume that’s all wrong, and someone out there does have a truly original thought, with zero overlap with anything that has come before. How could they possibly express that thought to someone else? Communication between people relies on some kind of shared context, but any shared context for this thought means it’s dependent on another idea, or “prior art,” so it couldn’t be 100% original. If you can’t share the thought with anyone, nor express it in any way to record it (because that again is communication), it dies with you. And you can’t even prove you’ve had it without communicating, so how would someone with such an original thought convince you they’ve had it?


  • Math, physics, and to a lesser extent, software engineering.

    I got degrees in math and physics in college. I love talking about counterintuitive concepts in math and things that are just way outside everyday life, like transfinite numbers and very high-dimensional vector spaces.

    My favorite parts of physics to talk about are general relativity and the weirder parts of quantum mechanics.

    My day job is software engineering, so I can also help people get started learning to program, and then with the next level of building a solid, maintainable software project. It’s more “productive” in the traditional sense, so it’s satisfying to help people be more productive, but when it’s just free time to shoot the shit, talking about math and science is way more fun.


  • I’m sorry, I mostly agree with the sentiment of the article in a feel-good kind of way, but it reads like when people claim bullies will get their comeuppance later in life, and then you actually look them up and they have high-paying jobs and wonderful families. There’s no substance here, just a rant.

    The author hints at analogous cases in the past of companies firing all of their engineers and then having to scramble to hire them back, but doesn’t actually get into any specifics. Be specific! Talk through those details. Prove to me that the historical cases are sufficiently similar to what we’re starting to see now to justify the claims in the rest of the article.




  • Robin Williams as the Bicentennial Man. The movie was okay; his performance was amazing. I’ve struggled with mortality for a while, like I expect a lot of people do, and seeing him play a character who starts their existence immortal and chooses mortality affected me deeply. His death in the movie hit me much, much harder than I expected. I haven’t watched the movie again since my first viewing because I’m honestly afraid of going through that again.




  • Not OP, but in my circles the simplest, strongest point I’ve found is that no cryptocurrency has a built-in mechanism for handling mistakes. People are using these systems, and people make mistakes. Without built-in accommodations, you’re either

    1. Creating real risk for anyone using the system, because each mistake is irrecoverable financial loss, and that’s pretty much the definition of financial risk, or
    2. Encouraging users to subvert the system in its core functionality in order to accommodate mistakes, which undermines the entire system and again creates risk because you don’t really know how anything is going to work with these ad hoc side systems

    Either way, crypto is just more costly to use than traditional systems when you properly factor in those risks. So the only people left using it are those who expect greater rewards to offset all that additional risk, which are just speculators and grifters.


  • I don’t think that follows, because those are temporary conditions, and consuming the drug is a choice made by an individual not currently under the influence. So it’s the person’s responsibility before they consume the drug to prepare their environment for when they are under the influence. If they’re so destructive under the influence that they can’t not commit a crime, it is their responsibility not to take the drug at all.


  • I’ve been the only one in my family using Linux for years, but over the last few months struggles with Windows have resulted in all but one computer in the house being migrated to Linux.

    I put it on my 10-year-old son’s desktop because Windows parental controls have become overly complicated and require internet connectivity and multiple Microsoft accounts to manage. I switched him to Linux Mint, installed the apt sources for the parental control programs, made myself an account with permission to change the parental controls and one for him without it, and done. With Steam he can play all of the games in his library.

    Only my wife is still using Windows, but with ads embedded in the OS ramping up, and features she liked getting replaced with worse ones, she’s getting increasingly frustrated with Microsoft.


  • Democratic candidates have raised far more than Republicans and can purchase ads at the cheaper rate offered to candidates. Republicans rely more heavily on independent expenditures from their campaign arm and allied super PACs, which have to pay much more per ad.

    Gee, it’s almost like Republicans aren’t favored by a large proportion of the population who can donate up to the ~$3,300 federal limit directly to campaigns, and so have to rely on their wealthy benefactors donating much, much more per capita through side channels that shouldn’t even exist in a functional democracy.



  • Before my comment I want to make clear I agree with the conclusion that abortion bans are clearly killing women at statistically significant rates.

    That said, the stats reporting here doesn’t make sense:

    Among Hispanic women, the rate of women dying while pregnant, during childbirth or soon after increased from 14.5% in 2019 to 18.9% in 2022. Rates among white women nearly doubled — from 20% to 39.1%. And Black women, who historically have higher chances of dying while pregnant, during childbirth or soon after, saw their rates go from 31.6% to 43.6%.

    There’s no way 14.5% of Hispanic women in Texas who got pregnant died at some point during pregnancy, during childbirth, or soon after. That would be unprecedented for any time since the advent of modern medicine. And the chart above this paragraph doesn’t agree with it either: it’s a chart of deaths per hundred THOUSAND live births, and the numbers for all racial groups are under 100, so less than 0.1%.

    The way it’s stated also doesn’t read as a relative percent increase, because it says the rate rose from 14.5% to 18.9%. I can’t figure out what they’re trying to say, but they should definitely have been more careful with presenting the numbers.
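
    To put numbers on the units mismatch, here’s a quick sanity check using only the figures quoted above, and assuming the chart’s “per 100,000 live births” is the correct unit:

    ```python
    # Convert "deaths per 100,000 live births" into percentages, using the figures
    # quoted in the article, to show how far off the "%" framing is.
    rates_per_100k = {
        "Hispanic women, 2019 -> 2022": (14.5, 18.9),
        "White women, 2019 -> 2022": (20.0, 39.1),
        "Black women, 2019 -> 2022": (31.6, 43.6),
    }

    for group, (before, after) in rates_per_100k.items():
        pct_before = before / 100_000 * 100   # e.g. 14.5 per 100k = 0.0145%
        pct_after = after / 100_000 * 100
        print(f"{group}: {pct_before:.4f}% -> {pct_after:.4f}%")
    ```

    So those “14.5%”-style figures are almost certainly rates per 100,000 live births, i.e. well under 0.1%, which is what the chart shows.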