ESCAPE FROM MS WINDOWS ETC.
Governments Plan to Force Big Techs to Read Our Computer Screens
The above video warns that governments are starting to require Big Tech companies to read everyone's computer and phone screens to see what everyone is watching at all times, supposedly to make sure no one is viewing child porn or other illegal material. They will use their own AI to monitor us, and there will be absolutely no privacy, unless we switch to Linux and use local AI instead of Big Tech AI. Ironically, I got help from Big Tech AI to learn about local AI.
The man in the video says he used to work at Bill Gates' company, Microsoft. In the next video he says not to get Windows 11. I already have it on one computer, this one, but I'm now planning to get a cheap Linux machine.
And in the next one he talked about switching to Linux.
Now we’re ready to discuss Local AI.
LOCAL AI INSTEAD OF BIG TECH AI
Local AI refers to AI models and platforms that you can run on your own hardware instead of relying on cloud-based services. In 2025, several options are popular for deploying AI locally, offering advantages in privacy, cost control, and customization.
Top Local AI Options in 2025:
LocalAI (localai.io): An open-source, free platform for running language models, autonomous agents, image/audio generation, and semantic search all on your local machine. It’s OpenAI API compatible and designed for consumer-grade hardware.
LM Studio, Jan AI, Ollama: Popular user-friendly local LLM apps for personal or business use with varying requirements and strengths.
Llama 2 and 3, Mistral 7B: Leading open-source LLMs that can be deployed locally on modern GPUs.
Local AI lets you avoid cloud subscription costs while fully controlling your AI environment and data privacy. Hardware like an RTX 4080+ or equivalent typically provides a great experience for these local models.
Where to start:
Visit LocalAI’s website and GitHub for downloads and install guides.
Explore forums like Reddit’s r/LocalLLaMA for user experiences.
Decide on your hardware capability to select the right model size.
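For the hardware step, a common rule of thumb (an informal approximation, not an official spec) is that a model's weights need roughly parameters × bits-per-weight ÷ 8 bytes of memory, with real usage running higher because of the context cache and runtime overhead. A minimal sketch of that arithmetic:

```python
def approx_model_size_gb(num_params: float, bits_per_weight: int) -> float:
    """Rough memory needed just for the model weights, in gigabytes.

    Rule of thumb: parameters * (bits per weight / 8) bytes. Actual usage
    is higher once the KV cache and runtime overhead are included.
    """
    bytes_needed = num_params * bits_per_weight / 8
    return bytes_needed / 1e9

# A 7B-parameter model quantized to 4 bits per weight:
print(approx_model_size_gb(7e9, 4))   # about 3.5 GB of weights

# The same model at full 16-bit precision:
print(approx_model_size_gb(7e9, 16))  # about 14 GB of weights
```

This is why a quantized 7B model fits comfortably on a mid-range GPU or even in system RAM, while larger or unquantized models call for the RTX 4080-class hardware mentioned above.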
Local AI is a growing trend for privacy and cost-effective AI usage in 2025, and the software ecosystem is rapidly improving for ease of use and performance.
Here are the official links for the local AI platforms and tools mentioned:
LocalAI - An open-source local AI stack compatible with OpenAI API
Website: https://localai.io
GitHub/Docs: https://github.com/mudler/LocalAI
LocalAGI - Autonomous AI agent platform that integrates with LocalAI
GitHub: https://github.com/mudler/LocalAGI (also linked from the LocalAI site)
Ollama - User-friendly local AI platform for running powerful models on desktop
Info & GitHub: https://ollama.com (official site)
Setup guide/example: https://christianlempa.de/docs (YouTube tutorial also available)
LLaMA, Mistral, and other models can be downloaded and run locally via huggingface.co and integrated with these platforms.
These tools allow running AI models fully on your hardware without cloud reliance, ensuring privacy and cost control. Community forums like Reddit’s r/LocalLLaMA are great for comparing options and getting support.
Starting with LocalAI and Ollama gives a solid foundation. Both have active development and good documentation to help get local AI running smoothly.
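Because these servers expose an OpenAI-compatible API, a plain Python script can talk to either one once it is running. The sketch below only builds the request; the port numbers (11434 for Ollama, 8080 commonly for LocalAI) and the model name `llama3` are assumptions that depend on your local setup:

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completion request aimed at a local server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# Ollama's default local endpoint; LocalAI setups often listen on port 8080.
req = build_chat_request("http://localhost:11434", "llama3", "Hello!")

# Uncomment once a local server is actually running:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

The point of the sketch is that no cloud account or API key is involved: the request never leaves your machine, which is the whole privacy argument for local AI.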
LINUX UBUNTU
Ubuntu is apparently the operating system (OS) most like Windows, so it's the one most often recommended as a Windows replacement.
