decision time rule
  • Dual power refers to a situation where two separate governing bodies or authorities coexist within a single country or region, each claiming legitimacy and competing for control or influence. This typically emerges during periods of political or social upheaval when an alternative power structure is established alongside the existing one. The idea is that by building alternative institutions or systems, people can challenge and eventually replace the current dominant power structures.

  • YouTube isn't happy you're using ad blockers — and it's doing something about it
  • On Android the Firefox app allows the use of extensions, including all the ad-block options you would find on PC. It works great, including for YouTube.

  • Loading Artist » User Terror
  • Poltergeist

  • Everything to keep disguise
  • According to the lore, demons are fallen angels, so you can keep this narrative going

  • My baby boi Radar tested positive for Lyme disease yesterday :(
  • I had Lyme Disease and it sucked, felt like a zombie walking through a sick foggy dream world until I got antibiotics. Luckily I noticed pretty quick it was from a deer tick. I hope your pupper gets treated and has a full recovery 😁

  • Be there for your ladies
  • Haha I can hear the sad music now, while slaying away those Elder Willows. There was something almost meditative about collecting those DB's.

    At one point I set up a bot to collect them; he grinded away so long on those that I swear he reached somewhere in the 90s for his level

  • Be there for your ladies
  • Your username gave me some major nostalgia. I think I played iRO like nearly 20 years ago! Even rolled a Super Novice lol

  • QBittorrent is better though.
  • omg this got me off hard*

    FTFY

  • stop asking for a karma system
  • You're correct, the entire system is already in place. The only thing that is currently missing is adding up all of someone's 'karma' from their posts and having it shown on their profile. Some of the apps already have this implemented since it's easy to incorporate.

  • Musk's new idea
  • This is a great summary of the fall of Elon Musk. I admired him too years ago until he went off the rails. RIP to the better decisions, and their benefits, that this man could have made.

  • Another screenshot LJ shared!
  • I hope there's an option to see the upvotes as well as downvotes rather than just the total. That's one thing I've liked about Jerboa so far that other apps aren't doing.

  • Our Banjo Boy
  • Do you also have a bird named Kazooie?

  • Lemmy Just Reached 1 Million Posts
  • Always has bean

  • GBA I modded to match a watch I modded.
  • Your modded color scheme and that splash screen 👌

  • Rule 1
  • I agree, we are still making important renewable advancements that are being used more and more.

  • Rule 1
  • The problem is that at any point we could have dealt with it, but greed seems to be preventing that, to the point that we may need a miracle tech breakthrough so great that fixing this issue would be trivial.

  • Rule 1
  • I wouldn't put it past this timeline, that's for sure

  • Rule 1
  • Unless we get fusion and/or general AI... Yup.

  • I logged in to Reddit today
  • It hurts my soul knowing all of this useful content was removed from reddit.

    I've moved on to Lemmy, but I'm having a hard time removing everything from reddit. I know that with so many people removing their old posts/replies it will definitely hurt reddit, since so much of the content people find through Google won't be there anymore.

    It would be cool if we could transfer our old posts here in some sort of meaningful way, but I don't see how that could happen in a way that makes sense.

    Today I learned @lemmy.ml Hopps @lemmy.world
    TIL about Wu-Tang Clan's 'Once Upon a Time in Shaolin,' an album so exclusive, only one copy was ever produced.
    • There are no circulating copies of the album online and it cannot be commercially exploited until 2103, but it can be played at listening parties.

    • It took about six years to record, with features from the entire Wu-Tang Clan, Redman, Cher, and even FC Barcelona soccer players and a Game of Thrones actress.

    • The album is unique with only one physical copy in existence, making it the most expensive work of music ever sold.

    • The album was bought by Turing Pharmaceuticals CEO, Martin Shkreli, for $2 million, who later lost it when his assets were seized following his conviction for securities fraud.

    • In 2021, it was bought by PleasrDAO, a non-fungible token (NFT) collectors' group, for $4 million to cover Shkreli's debts. PleasrDAO hopes to make it more accessible within the confines of 'listening parties'.

    MIT researchers make language models scalable self-learners
    news.mit.edu MIT researchers make language models scalable self-learners

    MIT CSAIL researchers used a natural language-based logical inference dataset to create smaller language models that outperformed much larger counterparts.

    TLDR Summary:

    • MIT researchers developed a 350-million-parameter self-training entailment model to enhance smaller language models' capabilities, outperforming larger models with 137 to 175 billion parameters without human-generated labels.

    • The researchers enhanced the model's performance using 'self-training,' where it learns from its own predictions, reducing human supervision and outperforming models like Google's LaMDA, FLAN, and GPT models.

    • They developed an algorithm called 'SimPLE' to review and correct noisy or incorrect labels generated during self-training, improving the quality of self-generated labels and model robustness.

    • This approach addresses inefficiency and privacy issues of larger AI models while retaining high performance. They used 'textual entailment' to train these models, improving their adaptability to different tasks without additional training.

    • By reformulating natural language understanding tasks like sentiment analysis and news classification as entailment tasks, the model's applications were expanded (see the sketch after this list for what that reformulation looks like).

    • While the model showed limitations in multi-class classification tasks, the research still presents an efficient method for training large language models, potentially reshaping AI and machine learning.
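
    For anyone curious what "reformulating a task as entailment" looks like in practice, here is a minimal sketch using an off-the-shelf NLI model through the Hugging Face transformers zero-shot pipeline. To be clear, this is not the MIT model from the article; the checkpoint and labels are just illustrative assumptions. Each candidate label is turned into a hypothesis, and the model scores whether the input text entails it.

    ```python
    # Minimal sketch of the entailment reformulation described above, using an
    # off-the-shelf NLI model (facebook/bart-large-mnli) via the Hugging Face
    # zero-shot-classification pipeline. NOT the MIT model; checkpoint and
    # labels are illustrative assumptions.
    from transformers import pipeline

    classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

    review = "The battery lasts two days and the screen is gorgeous."
    result = classifier(
        review,
        candidate_labels=["positive", "negative"],
        hypothesis_template="This review is {}.",  # each label becomes an entailment hypothesis
    )
    print(result["labels"][0], result["scores"][0])  # best label and its entailment-based score
    ```

    Under the hood the pipeline runs one premise/hypothesis pair per label through the NLI model and ranks the labels by their entailment probability, which is the same basic trick the summary describes.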

    Accelerating Drug Discovery With the AI Behind ChatGPT – Screening 100 Million Compounds a Day
    scitechdaily.com Accelerating Drug Discovery With the AI Behind ChatGPT – Screening 100 Million Compounds a Day

    By applying a language model to protein-drug interactions, researchers can quickly screen large libraries of potential drug compounds. Huge libraries of drug compounds may hold potential treatments for a variety of diseases, such as cancer or heart disease. Ideally, scientists would like to exper…

    TLDR summary:

    1. Researchers at MIT and Tufts University have developed an AI model called ConPLex that can screen over 100 million drug compounds in a day to predict their interactions with target proteins. This is much faster than existing computational methods and could significantly speed up the drug discovery process.

    2. Most existing computational drug screening methods calculate the 3D structures of proteins and drug molecules, which is very time-consuming. The new ConPLex model uses a language model to analyze amino acid sequences and drug compounds and predict their interactions without needing to calculate 3D structures.

    3. The ConPLex model was trained on a database of over 20,000 proteins to learn associations between amino acid sequences and structures. It represents proteins and drug molecules as numerical representations that capture their important features. It can then determine if a drug molecule will bind to a protein based on these numerical representations alone.

    4. The researchers enhanced the model using a technique called contrastive learning, in which they trained the model to distinguish real drug-protein interactions from decoys that look similar but do not actually interact. This makes the model less likely to predict false interactions (a toy sketch of this contrastive setup follows after this list).

    5. The researchers tested the model by screening 4,700 drug candidates against 51 protein kinases. Experiments confirmed that 12 of the 19 top hits had strong binding, including 4 with extremely strong binding. The model could be useful for screening drug toxicity and other applications.

    6. The new model could significantly reduce drug failure rates and the cost of drug development. It represents a breakthrough in predicting drug-target interactions and could be further improved by incorporating more data and molecular generation methods.

    7. The model and data used in this research have been made publicly available for other scientists to use.
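
    To make points 3 and 4 a bit more concrete, here is a rough, self-contained sketch of that kind of setup: project precomputed protein and drug feature vectors into a shared embedding space, score binding by similarity, and train with a contrastive loss so true pairs outrank decoys. It illustrates the general technique only, not the actual ConPLex architecture; the feature sizes, network shapes, and loss are all assumptions.

    ```python
    # Toy contrastive drug-target interaction scorer (illustrative, not ConPLex).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class PairScorer(nn.Module):
        def __init__(self, prot_dim=1024, drug_dim=2048, embed_dim=256):
            super().__init__()
            self.prot_proj = nn.Sequential(nn.Linear(prot_dim, embed_dim), nn.ReLU(), nn.Linear(embed_dim, embed_dim))
            self.drug_proj = nn.Sequential(nn.Linear(drug_dim, embed_dim), nn.ReLU(), nn.Linear(embed_dim, embed_dim))

        def forward(self, prot_feats, drug_feats):
            p = F.normalize(self.prot_proj(prot_feats), dim=-1)
            d = F.normalize(self.drug_proj(drug_feats), dim=-1)
            return (p * d).sum(-1)  # cosine similarity used as a binding score

    model = PairScorer()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

    # Dummy batch: each true protein-drug pair comes with a decoy drug that does not bind.
    prot = torch.randn(32, 1024)        # stand-in for protein language-model embeddings
    true_drug = torch.randn(32, 2048)   # stand-in for drug fingerprints/embeddings
    decoy_drug = torch.randn(32, 2048)

    pos_score = model(prot, true_drug)
    neg_score = model(prot, decoy_drug)

    # Margin-based contrastive loss: push true pairs above decoys by a margin of 0.5.
    loss = F.relu(0.5 - pos_score + neg_score).mean()
    loss.backward()
    optimizer.step()
    ```

    Once trained, scoring a new compound is just one forward pass per protein-drug pair, which is why this style of model can screen enormous libraries so quickly compared with structure-based docking.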

    AI Translates 5000-Year-Old Cuneiform

    A team from Israel has developed an AI model that translates cuneiform, a 5000-year-old writing system, into English within seconds. This model, developed at Tel Aviv University, uses Neural Machine Translation (NMT) and has fairly good accuracy. Despite the complexity and age of the language, the AI was successfully trained and can now help to uncover the mysteries of the past. You can try an early demo of this model on The Babylon Engine, and its source code is available on GitHub (Akkademia) and on Colaboratory.
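
    Mechanically, "neural machine translation in seconds" is just a sequence-to-sequence model call. The sketch below shows the general shape using the Hugging Face transformers seq2seq API; the checkpoint path is a placeholder (see the Akkademia repo for the team's actual model and preprocessing), and the input is a made-up transliteration fragment.

    ```python
    # General shape of an NMT inference call with Hugging Face transformers.
    # The checkpoint below is hypothetical -- substitute the actual Akkademia weights.
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    checkpoint = "path/to/akkadian-to-english"  # placeholder, not a real model id
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

    transliteration = "szum-ma a-wi-lum ..."  # made-up Akkadian transliteration fragment
    inputs = tokenizer(transliteration, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
    ```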

    Help our Machine Learning Community thrive: Seeking passionate participants!
    lemmy.world Machine Learning - Lemmy.world

    A Machine Learning community.

    Hey there! I've started a new machine learning community and it's ready for fresh voices and perspectives! Right now, it's just me and a couple of posts, but I'm excited to see where we can take this with your input. All experience levels are welcome. Whether you want to discuss the latest in ML, ask questions, or simply learn, this is the place. Come, be among the first contributors and let's shape this community together!

    Meta AI Reveals Game-Changing I-JEPA: A Leap Forward in Self-Supervised Learning Mimicking Human Perception and Reasoning

    Meta AI has revealed I-JEPA, an AI model which learns by comparing abstract representations of images, not the pixels. This self-supervised learning model fills in knowledge gaps in a way that mirrors human perception. I-JEPA is adaptable and efficient, offering robust performance even with a less complex model. Excitingly, the code for this pioneering technology is open-source. Check it out on GitHub!
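
    For anyone who wants the core idea in code form: the trick is to predict the embeddings of masked image regions from the embeddings of visible ones, and to compute the loss in that latent space rather than on pixels. Below is a heavily simplified toy sketch of that objective, not Meta's actual architecture (their open-source code is the real reference); the encoders, dimensions, and masking ratio are all placeholders.

    ```python
    # Toy sketch of a JEPA-style objective: predict target-encoder embeddings of
    # masked patches from context embeddings (loss in latent space, not pixels).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    dim = 128
    context_encoder = nn.Linear(768, dim)   # stand-in for a ViT over visible patches
    target_encoder = nn.Linear(768, dim)    # normally an EMA copy of the context encoder
    predictor = nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))

    patches = torch.randn(16, 196, 768)     # batch of images as 196 patch features each
    mask = torch.rand(16, 196) < 0.25       # ~25% of patches are prediction targets

    ctx = context_encoder(patches)          # context representations
    with torch.no_grad():
        tgt = target_encoder(patches)       # target representations (no gradients)

    # Predict target representations at masked positions and compare in latent space.
    pred = predictor(ctx)
    loss = F.smooth_l1_loss(pred[mask], tgt[mask])
    loss.backward()

    # In the real method the target encoder is updated as an exponential moving
    # average of the context encoder instead of by gradient descent.
    ```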

    13B-parameter Orca LLM is redefining what small LLMs are capable of.
    docs.kanaries.net Orca 13B: the New Open Source Rival for GPT-4 from Microsoft

    Experience the cutting-edge Orca 13b AI model from Microsoft, now small enough to run on your laptop. Learn from GPT 4 and imitate reasoning processes with ease.

    New threads populating on main page

    When I'm at the main lemmy.world page, at the top of the page new posts are popping up as I'm trying to browse posts - is there any way to stop this?

    Hopps @lemmy.world
    Posts 13
    Comments 71