PNG has been updated for the first time in 22 years — new spec supports HDR and animation
  • JXL is poorly supported, but it does offer lossless encoding in a more flexible and much more efficient way than PNG does

    Basically, JXL could theoretically replace PNG, JPG, and even EXR.

  • This happened one day in the window well outside my office. I got banned from r/pics for posting it.
  • I think it's just that the iPhone processing gives it a certain look that people associate with AI, for possibly incorrect reasons.

  • Desktop Linux distros similar to Steam OS?
  • Software support is basically identical across any Linux distro. It's not really a concern when choosing a distro to use. Of course some are easier to install stuff on than others.

  • What hardware does not support Linux?
  • Not going to surprise anyone but Windows Mixed Reality VR headsets aren't great on Linux, at least with controllers

    Although that is improving!

  • Who's in charge?
  • Hmm I just tried editing some systemd service with Kate and it did actually give me an authenticator popup when I tried to save it

    Although then the prompt expired and now it does nothing when I try to save it. Restarted Kate and now it works again...

    I haven't tried that before

    When I try to go into the sudoers.d folder tho it just says I can't, and the same thing happens when I try to open the sudoers file in Kate. If I try to copy and paste a systemd service in dolphin tho it just says I don't have permission and doesn't give a prompt.

    lol if I open it with nano through sudo it says 'sudoers is meant to be read only'

  • Who's in charge?
  • Yeah, when I was on XFCE on Arch I remember going into some places in the file manager where it wouldn't let me edit files etc. without running it from the terminal through sudo.

  • Who's in charge?
  • Is there a technical reason that Linux apps can't/don't just pop up an authenticator prompt asking for more privileges, like Windows apps can? Why does nano just say that the file is unwritable instead of letting me escalate privileges?
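
    There's no fundamental barrier: polkit provides exactly that kind of prompt on Linux, and GUI apps can hook into it (the authenticator popup Kate showed above is presumably this); terminal tools like nano just don't ask. A minimal sketch of the idea, assuming pkexec and a graphical polkit agent are installed (the target path and contents are only examples):

```python
# Sketch only: an unprivileged program asking polkit for an elevated write.
# Assumes pkexec and a graphical polkit authentication agent are available;
# the service path and file contents below are just examples.
import subprocess

def save_as_root(path: str, text: str) -> None:
    # pkexec shows the graphical authentication dialog, then runs `tee` as
    # root; the contents are piped over stdin, so the editor itself never
    # needs elevated privileges.
    subprocess.run(
        ["pkexec", "tee", path],
        input=text.encode(),
        stdout=subprocess.DEVNULL,
        check=True,
    )

save_as_root("/etc/systemd/system/example.service",
             "[Unit]\nDescription=demo\n")
```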

  • Games run faster on SteamOS than Windows 11, Ars testing finds
  • They're already going to only ship it through Steam. As long as you're using Steam, they don't care.

  • Games run faster on SteamOS than Windows 11, Ars testing finds
  • You could use Nsight, it has a Linux version and is very in-depth (one of its tools shows every draw call, another shows very detailed CPU tasks)

    Of course, it's harder to use than PresentMon

  • Anon turns on raytracing
  • It says on that page that SHaRC requires raytracing-capable hardware. I guess they could be modifying it to use their own software raytracing implementation. In any case, it's the exact same math for either hardware or software raytracing; hardware is just a bit faster. Unless you do what Lumen did and use a voxel scene for software raytracing.

  • Anon turns on raytracing
  • Yeah, that's just rasterized shadow mapping. It's very common; a lot of old games use it, as do most modern games. It's used in basically any non-raytraced game with dynamic shadows (I think the only other way to do it is directly projecting the geometry, which was only done by a few very old games and can only cast shadows onto single flat surfaces).

    The idea is that you render the depth of the scene from the perspective of the light source. Then, for each pixel on the screen, to check whether it's in shadow, you find its position in that depth texture. If it's further from the light than the depth that was recorded there, something else is blocking the light, so it's in shadow; otherwise it isn't. The result is filtered to make it smoother (there's a small sketch of the test after this comment). The downsides: it can't support shadows of variable softness without extra hacks that don't work in all cases (and physically, every shadow has variable softness), getting sharp shadows means rendering the depth map at a very high resolution, rendering a whole depth map is expensive, it renders pixels that are never seen, it doesn't scale down well to low resolutions (like if you wanted 100 very distant shadow-casting lights), etc.

    Raytraced shadows are actually very elegant, since they operate on every screen pixel (allowing quality to naturally increase as you get closer to any area of interest in the shadow) and naturally support varying shadow widths, at the cost of noise and maybe some more rays. Although they still scale expensively with many light sources, some modified stochastic methods look very good and allow far more shadow-casting lights than would ever have been possible with pure raster.

    You don't notice the lack of shadow-casting lights much in games because the artists had to put in a lot of effort and modifications to make sure you wouldn't.
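
    A minimal NumPy sketch of that per-pixel depth test (names like light_view_proj, shadow_map, and bias are illustrative, not from any particular engine; real implementations do this in a shader, with filtering):

```python
# Toy shadow-map lookup: is a world-space point in shadow?
# Assumes shadow_map stores depths from the light's point of view, in the
# same normalized range that the projection below produces.
import numpy as np

def in_shadow(world_pos, light_view_proj, shadow_map, bias=1e-3):
    # Project the point into the light's clip space.
    p = light_view_proj @ np.append(world_pos, 1.0)
    p = p[:3] / p[3]                              # perspective divide -> NDC
    u = np.clip(p[0] * 0.5 + 0.5, 0.0, 1.0)       # NDC [-1, 1] -> texture [0, 1]
    v = np.clip(p[1] * 0.5 + 0.5, 0.0, 1.0)
    h, w = shadow_map.shape
    stored = shadow_map[int(v * (h - 1)), int(u * (w - 1))]
    # Further from the light than whatever the light "saw" here -> occluded.
    # The small bias avoids surfaces shadowing themselves ("shadow acne").
    return p[2] > stored + bias
```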

  • Anon turns on raytracing
  • I heard the Source 2 editor has (relatively offline, think Blender-viewport-style) ray tracing as an option, even though no games built on it support any sort of real-time RT. Just so artists can estimate what the light bake will look like without actually having to wait for it.

    So what people are talking about there is lightmaps: essentially a whole extra texture on top of everything else that holds diffuse lighting information. It's 'baked' in a lengthy ray-tracing process that can take seconds to hours to days, depending on how fast the baking system is and how hard the level is to light. This just puts that raytraced lighting information directly into a texture so it can be read in fractions of a millisecond, like any other texture. It's great for performance, but it can't be quickly previewed, can't show the influence of moving objects, and technically can't be applied to any surface that isn't fully rough (so most diffuse objects, but basically no metallic objects; those usually use light probes and bent normals, and sometimes sample the lightmap anyway, although that isn't technically correct and can produce weird results in some cases).

    The solution for lighting dynamic objects in a scene with lightmaps is a grid of pre-baked light probes. These give lighting to dynamic objects but don't receive it from them.
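
    For the probe part, a toy sketch of the usual lookup: lighting for a dynamic object is trilinearly interpolated from the 8 surrounding probes in the pre-baked grid (names and layout here are made up for illustration; real engines typically store spherical harmonics per probe rather than a single colour):

```python
# Toy light-probe grid: trilinear interpolation of baked lighting for a
# dynamic object at `pos`. Assumes a regular grid with at least 2 probes
# along each axis; `origin` and `spacing` describe where the grid sits.
import numpy as np

def sample_probe_grid(probes, origin, spacing, pos):
    """probes: (nx, ny, nz, 3) array of pre-baked RGB values."""
    # Position in grid space, clamped so the surrounding cell stays in bounds.
    g = np.clip((np.asarray(pos, dtype=float) - origin) / spacing,
                0.0, np.array(probes.shape[:3]) - 1.001)
    i0 = g.astype(int)      # lower corner of the cell containing the point
    f = g - i0              # fractional position inside that cell
    color = np.zeros(3)
    # Blend the 8 surrounding probes with trilinear weights.
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                w = ((f[0] if dx else 1 - f[0]) *
                     (f[1] if dy else 1 - f[1]) *
                     (f[2] if dz else 1 - f[2]))
                color += w * probes[i0[0] + dx, i0[1] + dy, i0[2] + dz]
    return color
```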

  • Anon turns on raytracing
  • Still, even if every thread looks like it stays at 60%, a load that appears and disappears very quickly and gets averaged out on the graph (as it could in an unoptimised or unusual situation) could still be a factor. I think the only real way to know is to benchmark. You could try underclocking your CPU and see if the performance gets worse, if you really want to know.

  • Anon turns on raytracing
  • Really? Ambient occlusion used to be the first thing I would turn on. Anyway, 4K textures barely add any cost to the GPU. That's because they don't use any compute, just VRAM, and VRAM is very cheap ($3.36/GB of GDDR6). The only reason consumer cards are limited in VRAM is to prevent them from being used for professional and AI applications. If they had a comparable ratio of VRAM to compute, they would be an insanely better value compared to workstation cards, and manufacturers don't want to draw sales away from that very profitable market.
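
    For a rough sense of scale (this assumes BC7/DXT-style block compression at about 1 byte per texel, which is typical for colour textures; uncompressed RGBA8 would be 4x larger):

```python
# Back-of-the-envelope VRAM cost of a single 4K colour texture.
def texture_mib(width, height, bytes_per_texel=1.0, mips=True):
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mips else base   # a full mip chain adds ~1/3
    return total / 2**20

print(texture_mib(4096, 4096))   # ~21.3 MiB, so dozens of them fit in 1 GB
```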

  • Anon turns on raytracing
  • I haven't personally played a game that uses more than one dynamic reflection probe at a time. They are pretty expensive, especially if you want them to look high resolution and want the shading in them to look accurate.

  • Anon turns on raytracing
  • That's true, but after a few frames RT (especially with Nvidia's ray reconstruction) will usually converge to 'visually indistinguishable from reference', while light probes and such will really never converge. I think that's a pretty significant difference.

  • Anon turns on raytracing
  • RT was three generations ago, and I don't think they really vary the number of rays much per environment (and RT itself is an O(log n) problem, since each ray is just a walk down a BVH)

  • Anon turns on raytracing
  • There are cases where screen space can resolve a scene perfectly. Rare cases. That also happen to break down if the user can interact with the scene in any way.

  • Anon turns on raytracing
  • Of course, no renderer is really good enough unless it considers wave effects. If my game can't dynamically simulate the effect of a diffraction grating, it may as well be useless.

    (/s if you really need it)

  • Anon turns on raytracing
  • Unless you consider wireframe graphics. Idk when triangle rasterization first started being used, but it's conceptually more similar to wireframe graphics than to ray tracing. Also, I don't really know what you mean by 'fake it with alpha'.

  • double slit rule

    What New York might look like with a double slit as your camera aperture (a rough sketch of the method is at the end of this post).

    Original picture:

    [image]

    Double slit kernel:

    [image]

    What an eye might see, for comparison:

    [image]

    Here's a different, big double slit:

    [image]
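
    Roughly how an image like this can be produced (a sketch of one way to do it, not necessarily how these exact pictures were made): compute the far-field (Fraunhofer) diffraction pattern of a double-slit aperture as |FFT(aperture)|^2, then convolve the photo with that pattern as a point spread function. Slit dimensions below are arbitrary:

```python
# Sketch: build a double-slit aperture, turn it into a diffraction PSF, and
# blur an image with it. Slit sizes and the PSF resolution are arbitrary.
import numpy as np

def double_slit_psf(size=256, slit_w=4, slit_h=120, separation=24):
    ap = np.zeros((size, size))
    cy, cx = size // 2, size // 2
    for off in (-separation // 2, separation // 2):   # two vertical slits
        ap[cy - slit_h // 2:cy + slit_h // 2,
           cx + off - slit_w // 2:cx + off + slit_w // 2] = 1.0
    # Fraunhofer approximation: far-field intensity is |FT(aperture)|^2.
    psf = np.abs(np.fft.fftshift(np.fft.fft2(ap))) ** 2
    return psf / psf.sum()

def apply_psf(image, psf):
    """image: (h, w, channels) float array; psf must be no larger than image."""
    h, w = image.shape[:2]
    ph, pw = psf.shape
    canvas = np.zeros((h, w))                         # PSF centred on a canvas
    canvas[(h - ph) // 2:(h - ph) // 2 + ph,
           (w - pw) // 2:(w - pw) // 2 + pw] = psf
    kernel_f = np.fft.fft2(np.fft.ifftshift(canvas))  # move centre to the origin
    out = np.empty_like(image, dtype=float)
    for c in range(image.shape[2]):                   # FFT convolution per channel
        out[..., c] = np.real(np.fft.ifft2(np.fft.fft2(image[..., c]) * kernel_f))
    return out
```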

    rust rule

    In the new Minecraft April Fools snapshot, it makes your gear degrade more quickly as it takes damage

    deepseek r1 with the prompt "why do I"

    With the smaller 14b model (q4_k_m), just letting it complete the text starting with "why do I"

    [images]

    edit: bonus, completely nonsensical (?) starting with "I don't" (what could possibly be causing it to say this?)

    [image]

    "secure screenshot" idea

    I was thinking about how hard it is to accurately determine whether a screenshot posted online is real or not. I'm thinking there could be an option in the browser to take a "secure screenshot", which would tag the screenshot with the date, URL, and whether the page was modified on your computer. It could then hash both the tag and the image data and automatically upload this hash to some secure server somehow. There would need to be a way to guarantee that only the browser could do this, or at least some way to tell exactly what the source was. I'm not much of a cryptography person, but I would be surprised if it isn't possible to do this. Then, you could check whether a screenshot you see is legitimate by seeing if its hash exists in the list of real hashes.
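
    A toy sketch of the hashing half of the idea (it deliberately skips the hard part, which is proving the hash really came from an unmodified browser; the field names and the "registry" are made up for illustration):

```python
# Hash a screenshot together with a metadata tag, and check it against a
# registry of known-good hashes. Purely illustrative.
import hashlib
import json

def screenshot_digest(png_bytes: bytes, url: str, taken_at: str,
                      dom_modified: bool) -> str:
    # Canonical metadata tag, hashed together with the raw image data.
    tag = json.dumps(
        {"url": url, "taken_at": taken_at, "dom_modified": dom_modified},
        sort_keys=True,
    ).encode()
    return hashlib.sha256(tag + png_bytes).hexdigest()

# The browser would upload the digest to a public registry at capture time;
# a viewer later recomputes it from the screenshot + claimed metadata and
# checks whether it's in the registry.
registry = set()
d = screenshot_digest(b"<png bytes>", "https://example.com",
                      "2024-01-01T12:00Z", False)
registry.add(d)
print(d in registry)   # True -> matches a registered capture
```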

    Cartography Anarchy @lemm.ee
    Better Organized Europe

    I'm sure everyone's fine with this

    Does anyone else find it weird that the bottom of a graphics card is always the good looking side?

    reference image if you have no idea what I'm talking about:

    [image]

    I know this is a minor nitpick, but it's something that annoys me.

    [image]

    I got this graphics card mostly because it was the best deal on Amazon at the time (GPU shortage), and I also thought it looked decent from the images they had. However, now that it's actually installed, all I see is the relatively unattractive black metal backplate with some white text. The other side is always the side shown in the promotional images, too - not a single one of the pictures in the Amazon listing even shows the side that you'll be seeing 99.9% of the time. Do they think everyone hangs their PC above them from the ceiling, or has an open-air test bench? Why do they never even bother with the other side? I know they want the fans on the bottom so the cooling is better, but the air in front of the CPU shouldn't be that bad, a lot of cheaper GPUs don't need that much cooling anyway, and a ton of people have watercooling now, so the CPU radiators just go on the sides.

    colors rule

    my reasoning: the actual colors we can see -> the wavelengths that we can extrapolate to -> basically extrapolated wavelengths plus an 'unpure-ness' factor -> not even real wavelengths (ok well king blue and maybe lavender if I'm being generous could be)

    2024 Fediverse canvas Atlas @toast.ooo
    BeamNG Logo

    { "id": 7384484874, "name": "BeamNG logo", "description": "The logo of the game BeamNG.Drive, a softbody physics based realistic driving simulator.", "links": { "lemmy": [ "!beamng@lemmy.world" ], "website": [ "https://beamng.com/game/" ], "subreddit": [ "BeamNG" ] }, "path": { "0": [ [ 804, 294 ], [ 804, 294 ], [ 804, 292 ], [ 805, 291 ], [ 810, 289 ], [ 815, 289 ], [ 815, 291 ], [ 814, 293 ], [ 816, 295 ], [ 816, 298 ], [ 815, 299 ], [ 812, 299 ], [ 810, 297 ], [ 805, 297 ], [ 804, 295 ] ] }, "center": { "0": [ 810, 293 ] } }

    ... rule

    Just 3% fewer votes than Jill Stein, and he dropped out 3 months ago

    the 'look, we have global illumination!' box

    I've often seen this sort of thing in videos advertising GI in Minecraft shaders, and tried it out in Blender.

    RealSense depth camera in ticket machines

    This is at JFK; does anyone know what they are used for? There wasn’t an obvious time when it was taking a picture.

    [image]

    [image]

    Pixart-Σ just feels so much nicer to use

    Prompt: A cyberpunk scifi painting of a floating city in the air above the sea

    It uses a new, fancier 18 GB text encoder (T5) to follow the prompt much more closely. It isn't perfect, but it's much better than SDXL in my opinion. It does seem to be a bit worse at photorealistic subjects and has a tendency to create 1-pixel vertical lines.

    Some other images:

    [image]

    impressionist, a woman sits in the middle of a crowded cyberpunk street, people bustling around, orange and blue glowing signs, warm atmosphere

    [image]

    a bright cinematic photo of a solarpunk city at midday, skyscrapers, steel, glass, vines and fields of vivid tropical plants

    SDXL Turbo generated all of these images in 10 seconds

    I get around 1 image every quarter of a second on my 3060. The quality isn't up to par with regular SDXL (not even close), but it follows prompts well and is extremely fast (a typical way to run it is sketched at the end of this post). Here are some of the best images in this batch:

    [images]

    Prompt: "impressionist oil painting, watercolor, a crying old southern man eats cheese at sunset in front of a futuristic dystopian cyberpunk city"

    AdrianTheFrog @lemmy.world

    Posts 25
    Comments 791