Posts
2
Comments
206
Joined
2 yr. ago

  • While these investors are absolutely soulless and deserve to be called out, there's another aspect of this problem that I feel doesn't get talked about enough.

    If we just built enough housing, this problem would go away. And it would be easy if we had a system that allowed people to build new things and undercut the competition. But we can't, because regulations make it nearly impossible to build anything other than single-family houses.

    People investing in houses are a symptom of the larger overall problem: there not being enough fucking housing.

  • As a current PhD student, I am very skeptical of theories that present themselves as alternatives to dark matter.

    Dark matter is a very successful model for explaining our universe, with a ton of validation through simulations that can recreate a lot of what we see.

    Unless another theory can do that better than dark matter, it is hard to consider it favorably.

  • At face value, basic elements are not enormously complicated. But we can't even exactly model any element other than hydrogen; it's all approximations, because quantum mechanics is so complicated. And then there are molecules, which are even more hopelessly complicated, and we haven't even gotten to proteins! By comparison, our best transistors look like toys.

  • I did not immediately dismiss LLMs; my thoughts come from experience, observing the pace of improvement, and investigating how and why LLMs work.

    They do not think, they simply execute an algorithm. Yeah, that algorithm is exceedingly large and complicated, but there's still no thought, and no learning outside of training. Humans are always learning, even when they don't look like it, and our brains are constantly rewiring themselves; LLMs don't.

    I'm certain in the future we will get true AI, but it's not here yet.

  • Yeah, a lot of it is messy, but brains are not being replicated by commodity GPUs.

    LLMs have no intelligence. They are just exceedingly good at language, which has a lot of human knowledge embedded in it. Just read Claude's system prompt and tell me it's still smart, when it needs to be told 4 separate times to avoid copyright.

  • Current AI has no shot of being as smart as humans; it's simply not sophisticated enough.

    And that's not to say that current LLMs aren't impressive, they are, but the human brain is just on a whole different level.

    And just to think about it on a basic level: LLM inference can run off a few GPUs, roughly on the order of 100 billion transistors. That's roughly on par with the number of neurons, but each neuron has an average of 10,000 connections, which are capable of rewiring themselves to new neurons.

    And there are so many distinct types of neurons, with over 10,000 unique proteins.

    On top of that, there are over a hundred neurotransmitters, and we're not even sure we've identified them all.

    And all of that is still connected to a system that integrates all of our senses, while current AI is pure text, with separate parts bolted onto it for other things.
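    As a rough back-of-envelope check on the comparison above (the per-GPU transistor count, neuron count, and synapses-per-neuron figure are all commonly cited estimates, assumed here for illustration):

```python
# Back-of-envelope comparison: GPU transistors vs. brain synapses.
# All figures are rough, commonly cited estimates, not precise measurements.
gpus = 4                     # "a few GPUs" for LLM inference
transistors_per_gpu = 80e9   # ~80 billion transistors per high-end GPU (assumed)
neurons = 86e9               # ~86 billion neurons in a human brain
synapses_per_neuron = 1e4    # ~10,000 connections per neuron on average

total_transistors = gpus * transistors_per_gpu
total_synapses = neurons * synapses_per_neuron

print(f"transistors: {total_transistors:.1e}")  # ~3.2e11
print(f"synapses:    {total_synapses:.1e}")     # ~8.6e14
print(f"ratio: {total_synapses / total_transistors:.0f}x more synapses")
```

    Even granting the transistor/neuron parity, the synapse count comes out several thousand times larger than the transistor count.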

  • It's possible primordial black holes could be dark matter, but only in a certain allowed mass range, roughly around the mass of an asteroid.

    But somewhat confusingly, these would be poor candidates to seed these massive black holes.

  • I do research on dark matter, and one of the more interesting possibilities I have heard is that these black holes could in theory be formed by directly collapsing dark matter!

    There's an increasing amount of attention being spent on investigating a slight modification to the standard Lambda cold dark matter cosmology: allowing dark matter a small amount of self-interaction. This can then allow part of a larger dark matter halo to directly collapse into a black hole.

  • Ironically the reason we can't keep up with car infrastructure is because there's too much of it.

    It is much more costly to maintain, especially when scaling to more lanes.

    Reducing space given to cars and giving more to bikes/buses/trains would make it easier to upkeep our current roads.

  • No offense, but I'm gonna put a lot more weight behind a peer-reviewed Nature paper than behind some random podcaster.

    They explained their methodology pretty well. They extrapolated the microplastics amount from a small bit of cortical tissue and compared it to previous results. Yeah, there might not be as much in other parts of the brain, but we don't have a reason to think it would be drastically different.

  • Based on Wikipedia, I think the unit is milligals, where a gal is 1 centimeter per second squared.

    That is a bit of an odd unit, where g is ~980,000 milligals.

    So these changes are extremely small: the total variation is all within about 0.02% of gravity's normal value.
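    A quick sanity check on the conversion (the ~200 mgal variation is an assumed example amplitude, not a figure from the original map):

```python
# Convert standard gravity to milligals and check how small the variations are.
# 1 gal = 1 cm/s^2; 1 mgal = 1e-3 gal.
g_m_per_s2 = 9.80665      # standard gravity in m/s^2
g_gal = g_m_per_s2 * 100  # m/s^2 -> cm/s^2 (gal)
g_mgal = g_gal * 1000     # gal -> milligal

variation_mgal = 200      # assumed example anomaly amplitude in mgal
fraction = variation_mgal / g_mgal

print(f"g = {g_mgal:,.0f} mgal")           # g ≈ 980,665 mgal
print(f"variation ≈ {fraction:.4%} of g")  # ≈ 0.02% of g
```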

  • Damn

  • Newton's laws, including gravity and motion, can be expressed in terms of differential equations.

    Differential equations pretty much require calculus, which just hadn't been formalized yet.

    Newton's laws also provide convincing reasons for the necessity and legitimacy of calculus, by being able to derive orbits from his simple laws.
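    A minimal numerical sketch of that idea: integrating Newton's inverse-square law forward in time reproduces a stable orbit. Units here are chosen so GM = 1 (an arbitrary assumption for illustration), and the integrator is a simple leapfrog scheme.

```python
import math

# Integrate Newton's law of gravitation, d^2r/dt^2 = -GM * r / |r|^3,
# with a leapfrog (kick-drift-kick) scheme. Units chosen so GM = 1.
GM = 1.0
x, y = 1.0, 0.0    # start at radius 1
vx, vy = 0.0, 1.0  # speed for a circular orbit at r = 1
dt = 1e-3

def accel(x, y):
    """Inverse-square gravitational acceleration toward the origin."""
    r3 = (x * x + y * y) ** 1.5
    return -GM * x / r3, -GM * y / r3

# One full circular orbit takes T = 2*pi in these units.
steps = int(2 * math.pi / dt)
ax, ay = accel(x, y)
for _ in range(steps):
    vx += 0.5 * dt * ax; vy += 0.5 * dt * ay  # half kick
    x += dt * vx; y += dt * vy                # drift
    ax, ay = accel(x, y)
    vx += 0.5 * dt * ax; vy += 0.5 * dt * ay  # half kick

r = math.hypot(x, y)
print(f"radius after one period: {r:.6f}")  # stays ~1.0 for a circular orbit
```

    Leapfrog is a natural choice here because it's time-reversible and conserves energy well over long integrations, so the orbit doesn't artificially spiral in or out.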

  • In America, the housing problem is significantly driven by over-regulation. In most areas, it is only legal to build single-family homes. It shouldn't be a surprise that we can't build enough homes for everyone with only detached houses.

    That's not the only issue either.

  • Science Memes @mander.xyz

    Somebody is having a fun day

    Deep Rock Galactic @lemmy.world

    So now that season 4 has been out a bit, what are your thoughts?