Hey, howdy, hallo,

What a year so far.

I haven’t talked about AI much (or at all). But with the recent release of DeepSeek, now seemed like as good a time as any to share some of my thoughts on it.

🤓 How I use AI

I have a subscription to the ‘Plus’ ChatGPT plan. The main things I use it for are brainstorming and coding questions. There are privacy risks with using a service like this. While you can read their privacy policy and understand how they’re using your data, you never really know for sure. They also essentially stole everyone’s public work on the internet, but that’s a whole other topic.

TL;DR - I use it for specific tasks and don’t put ANYTHING private in.

🇨🇳 Don’t use DeepSeek (Web Version)

I think it should go without saying, but don’t knowingly use something that instantly sends all your data, interaction habits, keystrokes, etc., to the CCP (Chinese Communist Party). This is just a personal rule; your views may differ.

🦙 A better option

This is something I’ve been exploring recently since I wanted to try DeepSeek (without the web version). Props to them—they released it as open source, which means you can run the models locally on your computer. The models you can run will depend heavily on your hardware.

I’ve been using Ollama. Once you download and install it, you can search through the available models and download the one you want to use. I also use open-webui, which gives you a "ChatGPT-like" interface. (I know this is a lot, so let me know if you’d like to see a video on the setup.)
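If you want a feel for the workflow, here’s a rough sketch of the terminal side of things, assuming you’ve already installed Ollama (the exact model tags come from Ollama’s model library, and the Docker command follows open-webui’s suggested setup; double-check their docs for your system):

```shell
# Pull a distilled DeepSeek-R1 model from the Ollama library
# (pick a tag that fits your hardware -- e.g. 7b, 14b, 32b)
ollama pull deepseek-r1:7b

# Chat with it right in the terminal
ollama run deepseek-r1:7b

# Optional: run open-webui in Docker for a "ChatGPT-like" interface,
# pointed at the Ollama server running on the host machine
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

After that, the web interface is available at http://localhost:3000 in your browser.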

☹️ Downsides of running models locally

Most of us don’t have hardware powerful enough to run full-size AI models locally. But there are other options! I’m no expert on this, but here’s my quick explanation.

For example, the full deepseek-r1 model has 671 billion parameters. You need some serious hardware to run that in any usable state.

On the model’s download page, you’ll also see smaller versions: 1.5B, 7B, 8B, 14B, 32B, and 70B.

These are distilled versions of the full 671B model; the smaller the number, the fewer parameters. Each parameter is a learned value that helps the model process information and generate responses.
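To get a rough sense of what your hardware can handle, a back-of-envelope estimate (my own approximation, not an official figure) is parameter count times the bits stored per parameter. At 4-bit quantization, that puts a 32B model around 16 GB of memory and the full 671B model in the hundreds of gigabytes:

```shell
# Rough memory estimate in GB: parameters (in billions) x bits per parameter / 8
# Assumes 4-bit quantization; real usage adds overhead for context, activations, etc.
estimate_gb() { echo $(( $1 * $2 / 8 )); }

estimate_gb 32 4    # distilled 32B model at 4-bit: ~16 GB
estimate_gb 671 4   # full 671B model at 4-bit: ~335 GB
```

This is why the distilled models are the practical choice for consumer machines.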

I’ve been running the 32B and 70B models locally, and they do surprisingly well at answering questions and handling coding tasks.

While these models aren’t as good as the full versions, they still perform incredibly well. Because they’re distilled, you can run them on everyday consumer hardware.

🏆 Positives of running models locally

PRIVACY - Running a model locally means all your questions, keystrokes, and behaviors stay on your device. No data is sent to a third party.

COST - If you already own the hardware, you can save money by not having to pay a third party for a subscription.

FUN - If you’re like me, you might just enjoy doing it.

There’s a lot more to this topic, but I just wanted to provide enough information to get you started. Give it a try—it’s easier than you think.

I hope you had a great January, and I’ll see you in February!

-Josh



🧠 A website worth visiting

Looking to test your brain? This site has some fun games to play.

🎤 My latest podcast episodes

The first episode of Season 2 drops on Monday! Want to be notified when it goes live? Subscribe wherever you listen to podcasts.

🎬 My latest videos

🖥️ The Big Problem with Bitwarden Backups — I had an issue with my Bitwarden backup and lost data. I shared my story to hopefully prevent someone from making the same mistake.

🎙️ no BS podcast hosting

🟡 Yellowball is a podcast hosting service I built and run. I didn’t like the options out there when I wanted to start my podcast, so I built the service I wish existed. It’s where I host my show, In the Shell. If you’re interested in starting your own show, check out https://yellowball.fm for more information, or reply to this email if you have any questions about it.

✍️ Quote of the Month

“Footfalls echo in the memory
Down the passage which we did not take
Towards the door we never opened”
- T.S. Eliot

I recently finished the TV series Dark Matter. If you’re looking for something to watch, I highly recommend it. That quote was from the show. (The book is also great!)


🔬 What did you think?

I don’t track or analyze these emails, so I have no way of knowing if anyone reads them. If you enjoyed this email, reply with a ❄; if you didn’t, write back one sentence on what you would change.