
Posts: 9 · Comments: 485 · Joined: 2 yr. ago

  • Petty theft rings too true. Had a friend who worked at one of those bulk-ingredient shops and would regularly just take home like a kilo of rice or flour. They don't check anyway, and it hardly affects their bottom line.

  • hygiene

  • Having tried simple bidets in warm, cold, and neutral-ish climates, I find that cold-water bidets seem to stiffen the poo bits and make it hard to actually get them off your butt, especially since they stick to the hairs. You and I might be talking about different levels of cold, though.

  • You should give Claude Code a shot if you have a Claude subscription. I'd say this is where AI actually does a decent job: picking up human slack, under supervision, not replacing humans at anything. AI tools won't suddenly be productive enough to employ on their own, but I as a professional can use them to accelerate my own workflow. That's actually where the risk of them taking jobs is real: for example, instead of 10 support people you can have 2 who just supervise the responses of an AI.

    But of course, the Devil's in the details. The only reason this is cost-effective is that VC money is subsidizing and hiding the real cost of running these models.

  • Compilation is CPU-bound and, depending on the language, mostly single-core per compilation unit (e.g. with LLVM that's roughly per file). Incremental compilations will probably only touch a file or two at a time, so the biggest benefit comes from higher single-core clock speed, not higher core count. So you want to focus on CPUs with higher clock speeds (a rough timing sketch follows at the end of this comment).

    Also, high-speed disks (NVMe or at least a regular SSD) give you performance gains for larger codebases.
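    A minimal sketch of how you could check the full-vs-incremental difference on your own machine (my own addition, not part of the original comment; it assumes a make-based project, and the `make` targets and touched file path are placeholders):

    ```python
    # Rough sketch: time a clean parallel build vs. an incremental rebuild
    # after touching one file. Targets and paths are placeholders for
    # whatever your project actually uses.
    import subprocess
    import time
    from pathlib import Path

    def timed(cmd: list[str]) -> float:
        """Run a command and return how long it took in seconds."""
        start = time.perf_counter()
        subprocess.run(cmd, check=True)
        return time.perf_counter() - start

    # Clean build: many compilation units in flight, so core count matters.
    subprocess.run(["make", "clean"], check=True)
    full = timed(["make", "-j"])

    # Incremental build: touch a single file and rebuild. Typically only a
    # unit or two recompile, so single-core clock speed dominates.
    Path("src/main.cpp").touch()  # placeholder path
    incremental = timed(["make", "-j"])

    print(f"clean build:       {full:.1f}s")
    print(f"incremental build: {incremental:.1f}s")
    ```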

  • I think the main barriers are context length (useful context, that is: GPT-4o has a "128k context", but it's mostly sensitive to the beginning and end of the context and blurry in the middle, which is consistent with other LLMs) and training data that just doesn't really exist. How many large-scale, well-written, well-maintained projects are really out there? Orders of magnitude fewer than there are examples of "how to split a string in bash" or "how to set up validation in Spring Boot". We might "get there", but it'll take a whole lot of well-written projects first, written by real humans, maybe with the help of AI here and there. Unless, that is, we build AI with the ability to somehow learn and understand faster than humans. (A rough sketch of how you might probe that "blurry middle" claim follows below.)
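    A rough probe sketch for the "blurry in the middle" behavior (my own addition, not from the original comment; it assumes the `openai` Python package and an OpenAI-compatible endpoint, and the model name, filler text, and needle are placeholders):

    ```python
    # Hide a "needle" fact at different depths of a long prompt and check
    # whether the model still recalls it. Model name, filler, and needle are
    # placeholders chosen for illustration.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    FILLER = "The sky was a uniform grey that afternoon. " * 2000  # padding
    NEEDLE = "The secret launch code is 4417. "
    QUESTION = "\n\nWhat is the secret launch code?"

    for depth in (0.0, 0.5, 1.0):  # needle at the start, middle, and end
        cut = int(len(FILLER) * depth)
        prompt = FILLER[:cut] + NEEDLE + FILLER[cut:] + QUESTION
        reply = client.chat.completions.create(
            model="gpt-4o",  # placeholder model name
            messages=[{"role": "user", "content": prompt}],
        )
        answer = reply.choices[0].message.content or ""
        print(f"needle at {depth:.0%} depth -> recalled: {'4417' in answer}")
    ```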

  • People seem to disagree, but I like this. This is AI code used responsibly. You're using it to do more without outsourcing all your work to it, and you're actively still trying to learn as you go. You may not be "good at coding" right now, but with that mindset you'll progress fast.

  • As a former script kiddie myself, I think it's not much different from how I used to blindly copy and paste code snippets from tutorials. Well, environmental impact aside. Those who have the drive and genuine interest will actually come to learn things properly. Those who don't should stay tf out of production code; it's already bad enough, which is why we genuinely shouldn't let "vibe coding" be legitimized.

  • Technology @lemmy.world

    What are your AI use cases?

    Selfhosted @lemmy.world

    Help trying to set up an Ubuntu server as a router w/ a failover interface...

    Selfhosted @lemmy.world

    Gigabit switch on a non-gigabit router?

    Lemmy Shitpost @lemmy.world

    Are you seeing this?

    Memes @lemmy.ml

    One of the world's first memes

    Test @lemm.ee

    Other Test Post

    Selfhosted @lemmy.world

    Hybrid email setup w/ managed server but self-hosted client

    Selfhosted @lemmy.world

    Is there a way to have the same domain point to a local IP and remote depending on network?

    Lemmy @lemmy.ml

    Lemmy hotfix for home page bugs...