Debian 13 burning 10W playing 4K YouTube video on a Framework with max brightness 🫨
Obligatory: "Use Debian instead of Ubuntu. It's basically Ubuntu without Snap."
it was always wild to me back in the day when so many container images were based on ubuntu… was like PLEASE, debian is functionally identical here at like 1/10th the base container size!
Mostly yes, but there are functional differences in convenience. For example, the standard release upgrade process is completely manual. You have to disable third-party repos. You have to change the repos. You have to check if you have enough space. You have to remove obsolete packages. And more. On Ubuntu, the software update tool does all that, eliminating a lot of the possibility for error. To an experienced user, the Debian process is fine. A novice would have plenty of opportunity for frustration and pain.
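For the curious, the release-notes dance boils down to something like this (a rough sketch for a bookworm-to-trixie upgrade; read the actual release notes before running any of it):

```sh
# back up, then point the sources at the new release
sudo cp /etc/apt/sources.list /etc/apt/sources.list.bak
sudo sed -i 's/bookworm/trixie/g' /etc/apt/sources.list
sudo apt update
sudo apt upgrade --without-new-pkgs   # minimal upgrade first, per the release notes
sudo apt full-upgrade                 # then the full upgrade
sudo apt autoremove                   # clean out obsolete packages
```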
What? Software Center is GNOME, not Ubuntu. Discover is KDE, not Ubuntu. Debian updates can be done the same way. I don't do any of the things you mention; using Software Center or just apt upgrade works just fine.
It has a much slower release cycle and an ancient kernel. For people with new hardware it's not suitable.
Unless you prototype in a CPU fab it doesn't matter; Debian 13 came out last week and its kernel is not that old.
This is why Backports exists. You can get any newer packages or kernels you need by enabling it.
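For example, on Debian 13 ("trixie") that's one sources entry plus a targeted install (suite names below assume trixie; adjust for your release):

```sh
echo 'deb http://deb.debian.org/debian trixie-backports main' | \
  sudo tee /etc/apt/sources.list.d/backports.list
sudo apt update
# -t pulls the package from the backports suite, e.g. a newer kernel
sudo apt -t trixie-backports install linux-image-amd64
```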
And Ubuntu LTS doesn't go much farther ahead than base Debian.
Pop!_OS
Bullshit
I prefer "ubuntu without the bullshit"
Is that good or bad? What cpu? How big is the screen? What encoding?
It's a Framework with an 11th gen Intel i5. I've never seen it below 11W while doing this. I don't recall the exact number I got in Debian 12, but I think it was in the 11-13W range. The numbers were similar with Ubuntu LTS, which I used till about a year ago. Now I see 9-10W. The screen is 3:2 13". Not sure about the encoding, but I have GPU decoding working in Firefox.
> Not sure about the encoding
Right click on video -> Stats for Nerds
It's a YouTube video, so whatever YouTube is these days. I tested with this M1 MacBook Pro and it was using about 7 watts, so 3 watts more is pretty good for pretty much anything. I think my 12th gen laptop typically draws about 13-15W doing the same thing, but with a much dimmer screen.
The screen was not measured.
My phone uses like 30 on idle 🫠
It fluctuated between 8.8W and 10.3W.
Me with an older notebook that doesn't support av1 decoding: 😭
There's a browser extension called "Your Codecs." which can prevent YouTube from serving you AV1-encoded videos.
I wish there were more M.2 cards beyond just SSDs and wireless NICs. The idea of a small form factor PCIe interface is underutilized, and things like hardware codec accelerators could keep laptops with older processors usable with new standards for longer.

It's sad how PCMCIA had an entire ecosystem of expansion cards, yet we somehow decided that the much higher bandwidth M.2 is only for storage and networking. Hell, do what sound cards in the 90s/00s did and make M.2 SSDs specifically designed for upgrading older laptops, with built-in accelerators for the latest media standards. Hardware acceleration is energy efficient and could probably just be bundled into the flash controller the way it's bundled into the processor, and unless you have a top-of-the-line SSD you're probably not saturating the M.2 interface anyway.
capitalism underutilizes tech and it's sad. we could be in 2085 already if we didn't just waste time and materials on shit made to be thrown away in a few years.
I've seen 10-12W easily on 4K for SoCs without AV1. Your SoC (11th gen Intel) should support AV1 decode. Try playing the video in mpv (with yt-dlp integration) with various hw acceleration options to see if it changes; probably your browser is software decoding.
Even for SoCs with hardware decoding support, I noticed 2-3W of extra power usage when playing YouTube from the website compared to mpv or FreeTube. The website seems to be doing inefficient JS stuff, but I haven't profiled it.
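A quick way to check (a sketch assuming mpv and yt-dlp are installed and the iGPU decodes via VA-API; the URL is a placeholder):

```sh
# does the GPU advertise AV1 decode at all? (vainfo ships in libva-utils)
vainfo | grep -i av1

# force VA-API hardware decoding; mpv uses yt-dlp automatically when present
mpv --hwdec=vaapi 'https://www.youtube.com/watch?v=...'
# mpv logs "Using hardware decoding (vaapi)" when it actually engages;
# compare powertop readings against a run with --hwdec=no
```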
AV1 will probably increase power usage; it's made to reduce data consumption, not decoding effort.
on mobile platforms nowadays power is more important than data. OTOH for servers bandwidth is more important.
What command do you use to see the Watt used?
Powertop
( ͡° ͜ʖ ͡°)
and don't forget the calibration before use
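For reference, powertop's numbers are estimates. Something like this gives both the estimate and the battery's own reading (battery path varies by machine, e.g. BAT0 vs BAT1, and some report current_now/voltage_now instead of power_now):

```sh
sudo powertop --calibrate   # one-time; runs measurement cycles, the screen will flicker
sudo powertop               # interactive view of estimated per-device draw

# or read the battery's self-reported discharge rate (microwatts, while on battery)
cat /sys/class/power_supply/BAT0/power_now
```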
What cpu architecture is this?
x86_64
ngl I expected it to be ARM because of the low power usage.
That is very impressive! Although to be honest, I question the accuracy of all those estimated power draws. I would be interested to see an endurance test of your battery: assuming your battery capacity is accurate, your runtime on a full charge should line up with your power draw.
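As a rough sanity check (assuming the ~55Wh pack these Frameworks shipped with): a steady 10W draw should give about 55Wh / 10W ≈ 5.5 hours on a full charge. If the measured runtime comes out far from that, the software estimate is off.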
Honestly it's a little staggering how much better web video got after the W3C got fed up with Flash and RealPlayer and finally implemented some more efficient video and native video player standards.
`<video>` was a revolution.

Oh man, I was like a kid in a candy shop when I got my hands on Flash 4... built quite a few sites with it.
My unpopular opinion is that Flash was perhaps one of the greatest media standards of all time. Think about it: in 2002, people were packaging entire 15-minute animations with full audio and imagery, all encapsulated in a single file that could play in any browser, for under 10MB each. Not to mention, it was one of the earliest formats to support streaming. It used vectors for art, which meant that a SWF file would look just as good today on a 4K screen as it did in 2002.
It only became awful once we started forcing it to be stuff it didn't need to be, like a Web design platform or a common platform for applets. Each new, more advanced version of its scripting introduced new vulnerabilities.
It was a beautiful way to spread culture back when the fastest Internet anyone could get was 1 MB/sec.
Wasn't that when WHATWG took over the spec?
Ah I am not sure. I just assumed it was W3C.