Have you ever used git bisect?
  • I've used it only once, to find a bug in a section of code at a company I was working for, where neither I nor the other engineer I was working with knew the history. We were able to effectively triangulate the root of the pre-existing bug, and roughly how it was introduced, from the surrounding history. A very useful tool for this purpose, albeit one that I use very infrequently.
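    That triangulation can be sketched on a throwaway repo. Everything below (the file names, the `known-good` tag, the `grep` stand-in for a real test) is made up for illustration; `git bisect run` just needs any command that exits 0 on a good commit and non-zero on a bad one:

    ```shell
    #!/bin/sh
    # Toy demo of git bisect: build a small repo where one commit
    # introduces a "bug", then let bisect find that commit automatically.
    set -e
    repo=$(mktemp -d)
    cd "$repo"
    git init -q
    git config user.email demo@example.com
    git config user.name demo

    echo "fine"  > app.txt; git add app.txt; git commit -qm "good commit 1"
    git tag known-good                        # a commit we know works
    echo "fine" >> app.txt; git commit -qam "good commit 2"
    echo "BUG"  >> app.txt; git commit -qam "introduces the bug"
    echo "more" >> app.txt; git commit -qam "later commit"

    # Mark the endpoints: current HEAD is bad, the tag is known good.
    git bisect start HEAD known-good

    # bisect run tests each midpoint commit for us; the command must exit 0
    # when the checked-out commit is good and non-zero when it is bad.
    git bisect run sh -c '! grep -q BUG app.txt' | tee bisect.log

    git bisect reset -q   # return to the original HEAD
    ```

    On a real codebase the `sh -c '! grep ...'` bit would be your actual test script, and bisect does the binary search over however many commits lie between the two endpoints.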

  • Flogging Molly - Revolution
  • I had no idea Bernie used FM -- awesome!

    These fine folk will always have a special place in my music library. Best concert I ever went to in the late 00's, a memory I wish everyone could share.

  • What's your favorite Game Boy game that isn't Pokemon or Mario?
  • Same here! They made a DS one as a third installment, but the story leaves off in the middle, and the sense of awe just isn't there. Not the same as 1 & 2 anyway. Especially 2, when you [SPOILER] play the "bad" guys.

  • How often do you have to resolve merge conflicts?
  • It generally happens often when sections of developed code aren't pre-planned and modularized (even harder with data science, given the often functional nature of data science code bases). But when it does happen, it doesn't have to be awful.

    To resolve, I generally just create a safe local branch from my local copy to temporarily complete the merge in, and then I pull in the remote copy that I want to merge with mine using git merge -X theirs ${THEIR_BRANCH_NAME}, which favors their remote changes over yours (I assume origin is more correct than me). Conflicts will still arise, and you manually perform diffs and check in the final version, with conflicts resolved, as a new commit locally. Once complete, it is generally safe to push that temp branch to the remote or your fork for a Pull Request submission, or you may merge the temp branch, with the conflicts resolved, into your running branch. Either way, before the PR, make sure to run tests with the integrated changes first, and to pull the merged remote afterwards to fast-forward your running copy (such as with git merge -X theirs origin/${HEAD} or git pull origin ${HEAD}).
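    The core of that workflow can be sketched on a throwaway repo. The branch names (their-branch, temp-merge) and the file are hypothetical stand-ins; the local their-branch stands in for the remote branch you'd normally fetch first:

    ```shell
    #!/bin/sh
    # Sketch of the merge workflow above: create a safe temp branch from
    # your local copy, then merge "their" branch with -X theirs so their
    # side wins any conflicting hunks.
    set -e
    repo=$(mktemp -d)
    cd "$repo"
    git init -q -b main
    git config user.email demo@example.com
    git config user.name demo

    echo "shared line" > model.py
    git add model.py; git commit -qm "base"

    # Simulate "their" branch (stands in for the remote branch).
    git checkout -qb their-branch
    echo "their change" > model.py; git commit -qam "their edit"

    # Back on your running branch, with a conflicting local edit.
    git checkout -q main
    echo "my change" > model.py; git commit -qam "my edit"

    # Safe temp branch created from the local copy.
    git checkout -qb temp-merge

    # -X theirs resolves conflicting hunks in favor of the branch being
    # merged in; non-conflicting changes from both sides are still kept.
    git merge -q -X theirs their-branch

    cat model.py   # prints "their change": their side won the conflict
    ```

    Note that -X theirs only decides *conflicting* hunks; it is not the same as git merge -s ours/theirs, which discards one side entirely.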

    Best answer though: pre-plan your code base to include some modularity so that two people aren't actively working on the same file at once, encourage daily check-ins to remotes and daily pulls, and ensure that headless unit tests are in place for critical areas, such as logic and boundary cases, at minimum (and that those run in CI/CD). +1 if you use uniform Docker tooling to ensure all environments, even local, are the same. And another +1 if you have good telemetry based on APM metrics and traces for after code is integrated.

  • Total meat (flesh) supply may be a significant risk factor for cardiovascular diseases worldwide
  • I think they may already be onto the root cause, and it's surprising. According to a study published in 2013 (https://pubmed.ncbi.nlm.nih.gov/23563705/) and some related heart studies as recently as last year, L-carnitine digestion by the gut microbiome effectively fosters certain gut bacteria that release chemicals into the blood that create conditions for atherosclerosis.

    i.e., it might be our gut microbiome at work in a negative way, where genes play a role in how our bodies and microbiomes react more than others' to create these negative conditions.

    EDIT: spelling...

  • „Initials” (https://github.com/dicebear/dicebear) by „DiceBear”, licensed under „CC0 1.0” (https://creativecommons.org/publicdomain/zero/1.0/)
    TheCulturedOtaku @beehaw.org
    Posts 0
    Comments 11