I run a qemu/KVM setup in which I have different VMs for different use cases/profiles, very similar in theory to something like Qubes OS. So far, when I want to swap to another VM, I have to un-fullscreen the current one, click the other VM's display window, and fullscreen that. I was beginning to work on hotkeys and scripts to switch between VMs by assigning Ctrl+NumPad# to specific VMs and having the triggered VM appear fullscreen. But I'm imagining there's probably already a VM display manager that streamlines this.
Does anybody have any suggestions?
The biggest factor is that the display needs to be responsive as I'm using these VMs for daily tasks.
Bonus points if the display manager can output a variable for the currently focused VM so I can script the keyboard backlight to change to an assigned color as well as change the power profile of the base operating system to match the currently highlighted VM better.
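Since you're already headed toward hotkeys, here is a minimal sketch of that script, assuming X11 with xdotool installed and virt-viewer windows titled after each libvirt domain. The slot-to-VM mapping, the VM names, and the /tmp/focused-vm path are all placeholders to adapt; the last line also gives you the "currently focused VM" variable for backlight/power-profile scripting.

```shell
# Hypothetical hotkey handler: bind Ctrl+KP1..KP9 in your DE/WM to run
# this script with the slot number as the argument.
vm_for_slot() {
  # Map a numpad slot to a libvirt domain name (placeholders).
  case "$1" in
    1) echo "work"     ;;
    2) echo "personal" ;;
    3) echo "banking"  ;;
    *) return 1        ;;
  esac
}

switch_vm() {
  vm="$(vm_for_slot "$1")" || return 1
  # Focus the matching virt-viewer window; F11 toggles fullscreen there.
  win="$(xdotool search --name "$vm" | head -n1)"
  xdotool windowactivate "$win"
  xdotool key --window "$win" F11
  # Expose the focused VM for other scripts (backlight, power profile).
  echo "$vm" > /tmp/focused-vm
}
```

A backlight script could then just watch /tmp/focused-vm and react when its contents change.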
I disagree. I think if the company wants to sell its customers' data, they need to first submit to the installation of cameras in every room of each board member's private domicile, live streamed on the internet à la Fishtank. I think that's a fair trade.
Yeah, PimEyes absolutely needs to be shut down, and laws need to be in place to protect private citizens from having their information sharable and searchable without their explicit consent. "Publicly available information" is always the line people use to defend these services. I'm arguing that the law needs to adjust to our modern capabilities. Things shouldn't be so publicly accessible in the first place, and personal data aggregation should be a much more vetted and potentially licensed business. Can we talk about what other purpose these facial recognition databases serve other than to stalk, expose, or extort people? If they required proof of identity and only allowed searches of your own face, then I could understand the value.
You're right. I always thought Peertube was another YouTube frontend.
I think the only real path forward is for a developer to figure out a way to decentralize video hosting. The future of the free internet is decentralization. We've seen which way the wind blows when platforms are centralized.
Consumer storage is abundant and cheap as hell. There will need to be incentives for: 1. Creators 2. Node hosts 3. Moderators. Potentially AI could do the heavy lifting on number 3. Figuring out a way to avoid ad-based revenue would be another hurdle. In an ideal world, creators would accept that only 10% of their viewers will contribute to them monetarily (through Patreon or donations) and use the platform for its freedom from corpo bullshit.
But as much as the FOSS and decentralized crowd has been growing, I think we're still a long way out from average people becoming fed up enough to care. I still get eye rolls from everybody I know IRL when I try to get them to open an Invidious link.
I'm in the same boat as you in that I need Instagram for work. My approach is to create a separate work profile in GrapheneOS. I handle all of my mobile work apps in that profile using a separate VPN from all other profiles. I don't expect to be completely free from tracking in this profile, but for my threat model I don't mind too much. Any web queries I make in this profile I keep strictly work related.
People arguing you just shouldn't use Instagram need to remember Instagram is a tool just like Windows, Adobe, etc. Sometimes you need a specific tool to do your job and I believe as long as you containerize that aspect of your life then you'll be fine.
Just don't use your work Instagram for personal stuff, not even browsing memes.
I accidentally attempted to SSH into one of my servers from a device that did not contain my ssh key. I configure all of my servers to only allow authentication via cryptographic keys. Root ssh as well as password auth are disabled.
To my surprise, I was able to log in to my server with a password despite this. Baffled, I tried some other servers; 2 of the 5 others were accessible via password as well.
After some swift investigation, I found the culprit: a cloud-init SSH config in /etc/ssh/sshd_config.d/ with one line: PasswordAuthentication yes.
So, TL;DR PSA: if you run a server in any type of virtualized environment, including a VPS, check your /etc/ssh/sshd_config.d/ folder. And more broadly, actually test your SSH access thoroughly to confirm everything is working as you intend.
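For the "actually test it" part, you can ask sshd for its effective merged config instead of eyeballing individual files: `sshd -T` (run as root) prints the final result after the main sshd_config and all sshd_config.d/ drop-ins are applied. A small sketch around that:

```shell
# Extract one key's effective value from an `sshd -T` dump on stdin.
effective_setting() {
  # usage: sshd -T | effective_setting passwordauthentication
  awk -v k="$1" 'tolower($1) == k { print $2 }'
}
# As root on each server (you want the answer to be "no"):
#   sshd -T | effective_setting passwordauthentication
# And to find which file is responsible for a surprise:
#   grep -ri passwordauthentication /etc/ssh/sshd_config /etc/ssh/sshd_config.d/
```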
I was in your position recently and decided to install PVE from scratch and restore VMs from backup.
I had a fairly complex PVE config so it took some additional work to get everything up and running. But it was absolutely worth it.
I'm curious what the benefits are of paying for SSL certificates vs using a free provider such as letsencrypt.
What exactly are you trusting a cert provider with and what are the security implications? What attack vectors do you open yourself up to when trusting a certificate authority with your websites' certificates?
In what way could it benefit security and/or privacy to utilize a paid service?
And finally, which paid SSL providers are considered trustworthy?
I know Digicert is a big player, but their prices are insane. Comodo seems like a good affordable option, but is it a trustworthy company?
I don't want to be too specific for opsec reasons. But Windows 10 is the OS. OFX, aka OpenFX.
I'm familiar with Proxmox, virtualbox, and KVM/KVM manager.
If I want to set up a PC to virtualize multiple operating systems, but with the feel of a multiboot system, what virtualization software would you suggest?
My goal is to get as close as I can to a multiboot system (Windows, Debian, Fedora) but virtualized so I can make snapshots. It should feel like I'm on bare metal when inside the VM.
Virtualbox is clunky with lots of pesky UI cluttering the screen and Proxmox doesn't seem great for this use case.
The most important term to research regarding the arr apps is "hardlinking". Make sure you have your apps configured to use hardlinks. Everything else is pretty easy and self-explanatory.
For those of you that know, I'm trying to find a niche community, forum, chat room, whatever of individuals that could give me some pointers on cracking an OFX plugin. My knowledge ends at simple standalone exes and the communities I know of seem largely focused on game cracking.
If you know of a community that you think would help me on my journey, feel free to share. You can also send me a private message if you need to be discreet.
I replaced the drives, installed the newest version of PVE, then restored all of my VMs from local USB backup. I had to reconfigure a number of things such as HDD pass through and other network settings, but in the end the migration was a success.
What do you recommend for an at-a-glance access log dashboard? Kibana and Grafana seem cool but overkill.
All I want is a dashboard that can ingest and parse syslogs from various services and neatly display a list of currently connected IPs and usernames (if applicable), as well as an IP connection history.
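Until you settle on a dashboard, a crude version of the connection-history list can be scripted. This sketch assumes standard OpenSSH syslog lines; the log location varies by distro (/var/log/auth.log on Debian, /var/log/secure on RHEL, or journalctl).

```shell
# Pull "user ip" pairs from sshd "Accepted ..." log lines on stdin.
accepted_logins() {
  awk '/sshd.*Accepted/ {
    for (i = 1; i <= NF; i++) {
      if ($i == "for")  u  = $(i + 1)   # username follows "for"
      if ($i == "from") ip = $(i + 1)   # source IP follows "from"
    }
    print u, ip
  }'
}
# Example (Debian-style unit name):
#   journalctl -u ssh --no-pager | accepted_logins | sort | uniq -c
```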
I don't work in IT at all. My self-hosting journey started when I got sick of feeling powerless in the face of big tech companies who are increasingly ripping off customers or violating their right to privacy. There's also the general mistrust that comes from my data being repeatedly breached or leaked because shareholder profits are more important than investing in basic security.
Yeah, I would suggest buying another NIC. They're cheap, it's good security, and it opens up another port upstream for other uses.
I use the PCIe Coral and it works fine with plenty of processing to spare, although I believe mine cost closer to $50. I have 6 Amcrest PoE cameras. You should just buy a PoE switch, directly connect all cameras to it, then link that directly to your Frigate box and lock down access. Any Amcrest camera should work well with Frigate; I believe they all support RTSP.
When I say local I mean automated PVE backups the same as it would be through PBS. If that makes any difference.
I have a remote pbs but the backups aren't current because there was a connection error. I have Proxmox backups locally to a USB thumbdrive. That's what I was going to restore from.
With the EOL of PVE 7 and my need for more storage space, I've decided to migrate my VMs to a larger set of drives.
I have PVE installed bare metal on a Dell R720 with RAID1 SSDs. I'm a bit nervous about the migration.
I plan on swapping the SSDs, installing PVE8 from scratch, then restoring VMs from backup.
Should I encounter an issue, am I able to swap the old RAID1 SSDs back in, or once I configure the new ones are the old drives done for? I'm managing RAID on a dell RAID controller.
I also have my data hard drives passed directly into a TrueNAS VM, which supplies other VMs via NFS. Is there anything I should be concerned about once I've migrated, such as errors re-passing the data drives to the TrueNAS VM? Or should everything just work again?
Is there a master PVE config file I can save before swapping drives, to reference when configuring the new PVE install?
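There's no single master file, but on a stock install nearly all node and VM config lives under /etc/pve (the pmxcfs mount; per-VM configs are /etc/pve/qemu-server/*.conf), plus a few plain files in /etc. A pre-wipe snapshot could be as simple as the sketch below; the exact path list is an assumption you should extend for your setup (storage configs, passthrough udev rules, etc.).

```shell
# Wrap tar so the same call works for a quick test or the real thing.
backup_config() {
  # usage: backup_config <archive.tar.gz> <path>...
  tar czf "$@"
}
# On the PVE node, before swapping drives (standard Proxmox paths):
#   backup_config "pve-config-$(date +%F).tar.gz" \
#     /etc/pve /etc/network/interfaces /etc/hosts /etc/fstab
```

Copy the archive to the same USB drive as your VM backups and you have everything to cross-reference while rebuilding.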
So I use Fusion360 for the technical building of components: framing, drywall, cabinets.
I export this to 3dsmax and flesh it out for archviz. Rendering with V-ray.
Unfortunately there aren't any good options for pirating either of these programs.
3dsmax and vray also have very steep learning curves.
There are also better alternatives than Fusion360 which include BIM features, but they're insanely expensive unless you own a profitable architecture firm.
This is the way. Frigate just had a major update and the UI is now amazing.
I had a few episodes saved offline in my apple podcast app but it appears you are correct. Surely there's an archive somewhere?
I was listening to a Bazzell podcast where he mentions his company self-hosting and maintaining a database of personal data and credentials for use in OSINT investigations, some acquired through public sources but others acquired through leaks. Then of course there are data aggregation companies that do the same but go on to sell this data for a profit.
What is the legality of this? Obviously acquiring publicly available data is legal, but how are these companies able to hold on to leaked usernames, passwords, and other confidential personal information, especially data initially acquired through illegal means?
Is there something like Spotify Downloader or yt-dlp for Lidarr?
I got spotify playlists imported into Lidarr, but the artists I listen to don't seem to have any torrents.
I don't mind the quality hit of something like spotifydownloader which pulls from youtube. Is there anything like that or yt-dlp integrated into Lidarr for automated downloads?
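Not integrated into Lidarr as far as I know, but for the manual fallback, yt-dlp itself can do the YouTube-pull part: the "ytsearch1:" prefix and the -x/--audio-format/--embed-metadata flags are real yt-dlp options. The output path below is a placeholder; point it at whatever folder Lidarr imports from.

```shell
# Search YouTube for one result matching the query and extract audio.
fetch_track() {
  # usage: fetch_track "Artist - Title"
  yt-dlp -x --audio-format mp3 --embed-metadata \
    -o "/data/music/%(title)s.%(ext)s" \
    "ytsearch1:$1 audio"
}
# fetch_track "Boards of Canada - Roygbiv"
```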
I'll start by stating my threat model is avoiding corporate tracking, profiling, and analytics. For anything beyond that scope I believe tor is ideal.
Correct me if I'm wrong, but my understanding is that NewPipe is a frontend providing an alternative to the awful YouTube app and/or a YouTube account. However, your IP along with other device information may still be exposed to Google's servers. Any ideas as to what info beyond IP is sent to Google?
Invidious instances, on the other hand, act as a proxy in addition to what NewPipe offers, but you are trusting the instance owner with your privacy.
My idea for utilizing these services is the following: Newpipe for managing subscription based YouTube viewing. Google would have my IP, but this IP would be a VPN IP address that periodically changes. Much more reliable than invidious and better quality. App is great.
Invidious for random video searches as well as content I may want to be slightly more cautious about associating with.
I'm looking for feedback on this conceptual setup. I've also been considering making a public invidious instance that I can use but hopefully obfuscates my viewing through its usage by others.
spotify-downloader is great. I already have an arr stack running for movie and shows. It would be cool to add music to the mix.
I have a shared spotify playlist with friends that I pretty much listen to exclusively as of late. What I'd like is to have an arr app that constantly pulls from that playlist and downloads via spotify-downloader, so that I can listen to those songs from my private server and then I don't need to have spotify open so much.
The ideal setup would be a system where songs are pulled from a Spotify playlist and downloaded via spotify-downloader, then later, once a higher-quality version is discovered, that version is downloaded and replaces the initial YouTube-quality one.
I can't be the first to think of this, so I'm hopeful something like this is already ready to deploy. Thoughts?
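Until something turnkey shows up, a cron-driven sketch using spotDL could cover the "constantly pulls from that playlist" half. I'm assuming spotDL v4's `sync` subcommand and output templating here, so treat the flags as something to verify; the URL and paths are placeholders.

```shell
# spotDL's sync mode downloads a playlist and records state in a save
# file, so later runs only fetch tracks added since the last run.
sync_playlist() {
  # usage: sync_playlist <playlist-url> <save-file>
  spotdl sync "$1" --save-file "$2" \
    --output "/data/music/{artist} - {title}.{output-ext}"
}
# Wrap the call in a script and run it hourly from cron:
#   0 * * * * /usr/local/bin/sync_playlist.sh
```

The "replace with a higher-quality version later" half is the part I haven't seen automated anywhere; Lidarr's quality upgrades are the closest analogue.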
I recently acquired a Pixel phone and set up GrapheneOS (GOS). Prior to trying GOS I was using an iPhone, hardened as much as possible based on recommendations and guides from respected OSINT experts.
It's only been a week, but I've found GOS extremely frustrating and mostly useless except for web browsing.
I can’t seem to get my Yubikey to work so my 2FA is borked. Works fine on my iPhone.
I’ve previously managed to degoogle my life but now certain apps require me to use sandboxed google apps just to run.
I’m facing the nearly insurmountable task of convincing my friends, family, and colleagues to download and use signal when they are all using encrypted iMessage.
Most of my banking apps just simply do not work. Mobile banking is unfortunately something important that I need in my occupation. A part of the appeal of gos was being able to have an isolated dedicated profile for banking.
There are also a few features that I'm assuming are iPhone exclusive that really suck to go without: double tapping the bottom of the screen to shift everything down so you can reach the top one-handed, and holding down the space bar to move the text cursor between characters. Maybe these exist on GOS though?
I understand most of the issues lie with the app developers. I'm grateful to the devs for creating and working on this project, and I'm not bashing anyone here. I'm simply asking for some guidance, from those who were once in my position, on how I can break through these hurdles and make this work for me.
I’ve been using invidious for a few years. I recently changed up my morning routine and have been eating breakfast watching YouTube via the TV app versus on my PC.
It made me realize I kind of miss the recommended videos in some circumstances like when I just wanna veg out.
Are there any currently viable YouTube frontends that either maintain the algorithm or use their own to find you new content?
If you have an outdoor Ethernet port—in my case with a WiFi AP connected—how can you go about protecting your network from somebody jacking in?
Is there a way to bind that port to only an approved device? I figured a firewall rule to only allow traffic to and from the WiFi AP IP address, but would that also prevent traffic from reaching any wireless clients connected to the AP?
Edit: For more context, my router is a Ubiquiti UDM and the AP is also a UniFi AP.
What is the general consensus on trusting data removal services with the data you provide them?
I’ve spent 5 years telling myself I’ll go through the long lists of data aggregators and one by one manually send removal requests. But it’s such a massive undertaking. I’d like to finally get it done through one of these services, but my gut tells me it feels wrong.
Has anybody used them and how do you feel about it? Is DeleteMe a good choice?
I have a Dell PowerEdge R720xd in RAID10. I've had a couple of drives fail since I bought it and was able to buy cheap replacements on eBay.
I had another drive fail recently, and one of the spare eBay drives came up as "blocked". It set me back a few days while I waited for a new one to arrive, also from eBay.
I'd like to avoid getting another dud drive. Are there any reputable resellers of these old drives so I can stock up on some spares?
I’ve made a few posts in the past about my experimentation with connecting various devices and servers over a VPN (hub and spoke configuration) as well as my struggles adapting my setup towards a mesh network.
I recently decided to give a mesh setup another go. My service of choice is Nebula. Very easy to grasp the system and get it up and running.
My newest hurdle is now enabling access to the nebula network at the same time as being connected to my VPN service. At least on iOS, you cannot utilize a mesh network and a VPN simultaneously.
TL;DR: Is it a bad idea or a brilliant one to connect my iOS device to a Nebula mesh network so it can reach, for example, my security camera server, while also routing all traffic/web requests through another Nebula host running a VPN such as Mullvad? That way I could use my phone over a VPN connection while still having access to my mesh network servers.
As the title says, I'm trying to multiboot Fedora 40 and Ubuntu 24. The documentation and guides for this all seem pretty outdated through my searching and troubleshooting.
I currently have ubuntu installed. My drive partition table looks like this:
- sda1 -- EFI (250MB)
- sda2 -- /boot (ext4, 2GB)
- sda3 -- empty (ext4, 2TB) <-- Fedora partition
- sda4 -- Ubuntu 24 (LUKS encrypted, 2TB)
I'm trying to install Fedora now and it's giving me nothing but errors. The most useful guide I found for this specific setup just has you adding sda3 as the installation path (mounted at /) for Fedora, and it's supposed to figure out the EFI and boot partitions, but that doesn't happen. In fact, the EFI and /boot partitions show up under an "Unknown" tab in the Fedora custom partitioning window of the installer; they should be under a tab such as "Ubuntu 24 LTS". Fedora isn't recognizing the Ubuntu installation (because it's encrypted?).
Am I wrong in assuming that both OS's should be sharing the EFI and /boot partitions? Maybe that's the issue?
Anybody out there successfully dual booting Linux distros with both distros encrypted?
For years I’ve had a dream of building a rack mounted PC capable of splitting its resources to host multiple GPU intensive VMs:
- a few gaming VMs
- a VM for work that can run Davinci Resolve and Blender renders
- an LLM server
- a Stable Diffusion server
- media server
Just to name a few possibilities…
Every time I've looked into it, it seemed like the technology just wasn't there yet. I remember a few years ago Linus TT took a shot at it, but in the end suggested the technology (for non-commercial entities) just wasn't in a comfortable spot yet.
So how far off are we? Obviously AI focused companies seem to make it work, but what possibilities exist for us self-hosters who might also want to run multiple displays in addition to the web gui LLM servers? And without forking out crazy money for GPU virtualization software licenses?
HACS has a problem with hitting the GitHub rate limit when you first install it. It’s not really that big of a deal. You usually just need to wait an hour for the local database to populate.
It used to be optional to link your GitHub to HACS to bypass the rate limiting but now it seems the installation requires it.
I'm not a fan of this; as somebody who uses Home Assistant for its privacy values, I'm kind of frustrated that HACS removed the ability to install without a GitHub API key.
Is there a manual way to override the API linking process?
Would this work or would I have problems:
Using the dd command to back up an entire SSD containing dual-boot Windows/Ubuntu partitions into an image file (a raw disk image, rather than a true .iso), with the intent to then dd that image back onto a same-size SSD in the case of a drive failure?
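This generally works, with caveats: the restore target must be at least as large as the source, and you should run it from a live USB so nothing on the drive is mounted mid-copy. A minimal sketch; the device and image paths are examples, so double-check with lsblk before running.

```shell
# Raw byte-for-byte copy between a device/file and an image file.
image_disk() {
  # usage: image_disk <source> <destination>
  dd if="$1" of="$2" bs=4M status=none conv=fsync
}
# Backup  (from a live USB): image_disk /dev/sda /mnt/backup/ssd.img
# Restore (source/dest swapped): image_disk /mnt/backup/ssd.img /dev/sda
```

For interactive use, swap status=none for status=progress to watch the copy; either way, verify the image afterwards (e.g. with cmp or checksums).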