Is it a bad idea to use my desktop to self host?
What are the disadvantages? Can they be overcome?
I use it primarily for programming, sometimes gaming and browsing.
It’s a terrible idea - do it anyway. Experimentation is how we learn.
If you have a reasonably modern multi-core system you probably won’t even notice a performance hit. The biggest drawback is that one machine is holding all your eggs: if an upgrade goes wrong, or you’re taking things down for maintenance, then everything is affected. And there can be conflicts between the versions of libraries, OS, etc. that each service needs.
Separating services, even logically, is a good idea. So I’d recommend you use containers or VMs to make it easier to just “whelp, that didn’t work” and throw everything away or start from scratch. It also makes library dependencies much easier to deal with.
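To make that concrete, here’s a rough sketch of what per-service isolation looks like with Docker Compose — the image names are the real official ones, but the paths and the second service are just placeholders for whatever you actually run:

```yaml
# docker-compose.yml -- one container per service, each with its own
# libraries and config, so a failed upgrade only affects that container
services:
  jellyfin:
    image: jellyfin/jellyfin
    volumes:
      - ./jellyfin/config:/config   # config lives outside the container
      - /mnt/media:/media:ro        # media mounted read-only
    ports:
      - "8096:8096"
  nextcloud:
    image: nextcloud
    volumes:
      - ./nextcloud/data:/var/www/html
    ports:
      - "8080:80"
```

`docker compose up -d jellyfin` brings up one service; `docker compose down` throws the containers away without touching your desktop install, and the bind-mounted config directories survive the “start from scratch” moments.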
So I already host a lot of stuff on a Raspberry Pi 4B. But when I tried to host Jellyfin, transcoding was a struggle on it, so I used my desktop to host Jellyfin as a quick solution, using sshfs from the Raspberry Pi to access the media files. So now I wonder: is it worth moving Jellyfin to something else? Is it worth moving the media files to the desktop?
Is it performing well as is? sshfs isn’t very high performance, but if it’s working, it’s fine - NFS would likely perform better though. I run Jellyfin in a VM with an NFS mount to my file server and it works fine. The interface is zippy and scanning doesn’t take too long. I don’t get GPU acceleration, but the CPU on that system (10th-gen i7, I think) is fast enough that I haven’t had much trouble with transcoding (yet).
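For comparison, the two client-side mounts look something like this in `/etc/fstab` — the IP and paths are made up for illustration, and the sshfs line assumes key-based auth is already set up:

```
# sshfs: easy to set up, but every read goes through ssh encryption
pi@192.168.1.10:/media  /mnt/media  fuse.sshfs  ro,allow_other,_netdev  0  0

# nfs: lower overhead on a trusted LAN, usually noticeably faster
192.168.1.10:/media     /mnt/media  nfs         ro,_netdev              0  0
```

Swapping one for the other is a one-line change, so it’s cheap to benchmark both against your actual library.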
It’s actually not bad, surprisingly. I have had issues sometimes, but they’re network issues related to my router. I haven’t had them in a while.
If it’s working - that’s fine. Creating dependencies can make things more complex (you now need two systems running for one service to work) - but also isolating ‘concerns’ can be beneficial. Having a single “file server” lets me re-build other servers without worrying about losing important data for example. It separates system libraries and configuration from application data. And managing a file-server is pretty simple as the requirements are very basic (Ubuntu install with nfs-utils - and nothing else). It also lets me centralize backups as everything on the file server is backed-up automatically.
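For reference, the server side of an NFS setup really is that small — roughly this, with the path and subnet as placeholders (note the Ubuntu/Debian package is `nfs-kernel-server`; `nfs-utils` is the name on Fedora/Arch):

```
# /etc/exports -- export /srv/media read-only to the local subnet
/srv/media  192.168.1.0/24(ro,sync,no_subtree_check)
```

Then `sudo exportfs -ra` applies the change without a restart.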
Things can be as simple or as complex as you want. I will re-iterate that keeping a “one server per service” mindset will pay off in the long-run. If you only have your desktop and a Pi then docker can help with keeping your services well isolated from each other (as well as from your desktop).
It’ll affect performance. It will need to be always on. It risks interaction between your normal applications and server services. Also, all your eggs are in one basket if something goes wrong. That said, it should be fine. Just take frequent snapshots and backups of important data.
If you are going to use your desktop, I would suggest putting all of the self-hosted services into a VM.
This means if you decide you do want to move it over to dedicated hardware later on, you just migrate the VM to the new host.
This is how I started out before I had a dedicated server box (refurb office PC repurposed to a hypervisor).
Then host whatever/however you want to on the VM.
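With libvirt/KVM, for example, that later migration is just a couple of commands — a rough sketch, assuming a VM named `homelab` with its disk in the default storage pool (the VM name, hostname, and paths are all hypothetical):

```
# on the old host: export the VM definition and copy it plus its disk
virsh dumpxml homelab > homelab.xml
scp /var/lib/libvirt/images/homelab.qcow2 newhost:/var/lib/libvirt/images/
scp homelab.xml newhost:/tmp/

# on the new host: register the definition and start it
virsh define /tmp/homelab.xml
virsh start homelab
```

Because the guest OS and all its services live inside the disk image, nothing inside the VM has to change when the hardware underneath it does.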
I mean, I use a regular desktop computer that I just installed Ubuntu on and plugged it into an ethernet cable in the closet and closed the door. Now it’s my server. RGB and all.
RGB and all.
Proper server
By hosting services on your desktop, you are increasing your attack surface. Every additional piece of software you run increases your potential to catch malware. It also requires powering a beefy machine 24/7 to keep the service up, when in reality anything that isn’t a media server can run on 3rd-gen Intel CPUs with relatively low TDP.
Conversely, you could also run them on a low-end chip of a current/recent gen and get even lower power draw for equivalent or better performance.
Not false, but older parts tend to be cheaper.
And if you’re allergic to buying used, there’s always the mini computers.
Is power consumption a consideration? I want my self hosted server on 24/7, so a low-power single board is much more economical for me.
Also, are resources a problem? If your game is maxing out your rig and some batch job on a self hosted service starts, that could be annoying — or it could be a non-issue, just depends on your usage both as a desktop and a server.
I would not recommend using your primary desktop for self hosting. If you absolutely have to, install VirtualBox or some other hypervisor solution and run your servers in separate VMs.
Use a dedicated host. It can be a desktop, server, Raspberry Pi, etc., depending on your needs. Sooner or later you’ll find that hosting on a workstation you use for other things is horribly inconvenient. Depending on what you’re self hosting, it can consume lots of resources. And if you become dependent on the services you’re hosting - which is the point of self hosting to begin with - even really small things like rebooting your workstation become really inconvenient.
I’ve got an old Dell PowerEdge ticking away in my basement that runs all my VMs. I can reboot my desktop without interrupting any of my self hosted services. It also makes it easier to back up my VMs, and I can easily spin up a new one if needed. You have to be careful if you use server hardware though. The T430 that I have is pretty efficient, but some servers can be thirsty little space heaters.
I do and it’s fine.
I used to have a separate machine for server stuff but it just cost more in electricity since I would leave them both on 24x7 anyway.
I’ve got 64GB of RAM and I often use up to 48GB of it with various VMs. I wouldn’t get any power savings with a separate server, since I have a cron job that transcodes everything Plex recorded off of TV during the day to AV1 for disk space savings (usually turns 3GB of MPEG-2 into 700MB of AV1), so I would need a server with a moderately powerful CPU anyway for that.
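A nightly job like that could be sketched roughly as follows — the schedule, paths, and encoder settings here are illustrative guesses, not the actual setup:

```
# crontab: run the re-encode script at 3am, after the day's recordings
0 3 * * * /usr/local/bin/transcode-recordings.sh

# transcode-recordings.sh (sketch): software AV1 via ffmpeg's SVT-AV1 encoder
for f in /srv/plex/recordings/*.ts; do
  ffmpeg -i "$f" -c:v libsvtav1 -crf 35 -preset 8 -c:a copy "${f%.ts}.mkv" \
    && rm "$f"   # only delete the original if the encode succeeded
done
```

Software AV1 encoding is exactly the kind of sustained CPU load that decides whether a low-power box is viable or not.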
I have a Ryzen 3700X. Got it since it was the highest-performance chip that was still 65W TDP at the time; I didn’t want to spend a ton on electricity and extra air conditioning since I would be leaving it running 24x7.
The only time I notice a performance impact during gaming is when my Windows 11 VM is running. I don’t really need that one up 24x7, so I shut it down if it happens to be running at the time.
What do you want to self host? To learn or experiment buy a cheap old x86 box. I get mine at goodwill auction. Otherwise desktop is good if you want something that needs more compute and that you’d spin up as needed vs always on.
Convenience is the main issue. AFAIK, as long as you secure your device, it’ll do the job
I would learn about MTBF (mean time between failures) if I were you. Everything has a failure point (generally buried deep in the product info), and when you start keeping it on 24/7, the hours will burn. Those fans on your GPU will give up way before they normally would.
Finding a used server is not a good idea either. You aren’t in a data center and servers are super loud. Also, they chew up electricity like a hungry dog and his dinner.
As others have said, find a used desktop somewhere and a cheap KVM switch so you can use the same peripherals for both. It doesn’t need to be beefy by any measure (maybe drive space), just affordable.
I use a “regular” desktop as my server. It uses much less power than most servers and still has plenty of horsepower for what I do.
Remote management and (cheap) ECC ram are the biggest reason to get a server. But those usually aren’t issues for most work loads, especially at home.
Shit I used to run my stuff off of a laptop with maxed out ram, and some people just have a raspberry pi and call it a day.
I think it’s kinda silly.
I see workstation graveyards in closets and garages that would make perfectly good white-box servers… yeah, they’re retail machines that mostly lack the ‘always on’ resilience of server-grade hardware, but a dedicated box for a server process is always better than a shared user/server environment.
You’d probably be hard pressed to find any old retail box that you couldn’t slap a *nix variant on.
Keep good backups… in my experience, hardware dies in this order: spinning drive, power supply, motherboard.
I remember someone in this community self-hosting on an Android phone, a Samsung S20 I think.
My first homelab was a synology NAS, and my gaming PC with a DIY linux hypervisor as the main OS, a linux VM for hosting servers, and a Windows/Mac/Linux VM trio (each with GPU passthrough) that I would switch between for my workstation. I lost performance for sure, but it taught me a lot without the need to purchase more hardware.
If you consider it temporary, it’s not a bad way to learn.