I built my own Linux environment on Windows using WSL2 (and you should, too)

As someone deeply immersed in AI and ML development, I’ve often found myself juggling multiple computing environments. My workday typically involves switching between a Windows laptop at work, my personal Mac Studio (along with a Windows desktop) at home, and SSH connections to remote computing clusters for intensive training jobs or for working on data files that must stay put (as governed by a DUA). This fragmentation created friction in my workflow that I was eager to solve.


The Multi-OS Challenge

The context switching became a real pain point. On Monday I’d be deep in Windows-land with its PowerShell quirks and directory structures; by evening I’d transition to my Mac’s Terminal with its Unix-like commands; and for model training, I’d SSH into Linux servers. Each environment has its own terminal commands, environment variables, and package management system.

Note: The latter two are relatively close, in fact. Recall that the core of macOS is Darwin, which is POSIX-compliant (certified against SUSv3) and derived from BSD Unix variants.

For instance, a simple task like setting up a Python environment with the right dependencies could require slightly different approaches:

# On Windows
python -m venv env
.\env\Scripts\activate

# On Mac/Linux
python3 -m venv env
source env/bin/activate

This constant mental context switching was not only frustrating but also error-prone in some circumstances.


The Unix Preference

I gravitated toward Unix-like systems (Linux and macOS) for development work for several key reasons. Package management on Linux distros like Ubuntu is straightforward with apt. The command-line interface is powerful and consistent. Most importantly, many AI/ML tools and libraries are built with Unix-like systems in mind, making installation and usage more seamless.

Take CUDA installation for deep learning. On Linux, it’s often as simple as:

sudo apt install nvidia-cuda-toolkit

On Windows, by contrast, it might involve multiple installers, environment variable configuration, and occasional incompatibilities.

Additionally, tools like GDB, Valgrind, and many compiler toolchains feel more at home in a Unix environment. When debugging memory issues in C++ code for a custom low-level kernel, having Valgrind readily available is invaluable.
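
For example, when I suspect a leak, the check is usually as simple as this (leak.cpp is just a placeholder for whatever source file is under suspicion):

# Build with debug symbols so Valgrind can report file/line information
g++ -g -O0 leak.cpp -o leak

# Run the binary under Valgrind's memcheck tool
valgrind --leak-check=full --track-origins=yes ./leak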


The Corporate Windows Reality

Despite my Unix preference, my work laptop came with Windows, and institution policy prohibited dual-booting or replacing the OS. This is a common scenario in corporate environments where IT departments standardize on Windows for security, management, and support reasons.

I needed to find a way to create a Linux development environment within the constraints of my Windows machine.


The Cygwin Attempt

Before virtual machines, I tried Cygwin as my first bridge between Windows and Linux. This tool promised Unix-like functionality by providing recompiled Linux utilities that ran natively on Windows.

Initially, Cygwin seemed promising. I could use familiar commands like grep, sed, and awk, navigate with forward slashes, and write basic bash scripts. For simple text processing and automation tasks, it was adequate.

However, several limitations quickly became apparent:

  1. Compatibility gaps - Many libraries wouldn’t work because Cygwin only provides a Unix-like emulation layer; it doesn’t offer a true Linux kernel or system call compatibility
  2. Performance issues - Computation-heavy tasks ran significantly slower than on native Linux
  3. Package management frustrations - Installing and updating packages was cumbersome compared to apt, with many outdated or broken dependencies
  4. Path confusion - The /cygdrive/c/ prefix for accessing Windows drives created constant friction in scripts

I still remember wrestling with a Python deep learning framework that refused to compile its C++ extensions under Cygwin. After hours of debugging cryptic error messages, I realized I was forcing a square peg into a round hole.

While Cygwin introduced me to the power of Linux tools, it was ultimately just a partial solution—sufficient for basic scripting but inadequate for serious AI development work. I needed something closer to a real Linux environment, which led me to explore virtual machines.


The Virtual Machine Era

As Cygwin’s limitations became increasingly frustrating for my development needs, I turned to the next logical solution: full virtualization, i.e., virtual machines. VMware Workstation became my tool of choice, running Ubuntu as a guest OS.

This approach worked but had significant drawbacks:

  1. Performance overhead - VMs require dedicated RAM and CPU resources
  2. Graphics acceleration issues - crucial for visualizing some ML results
  3. File sharing friction - moving data between host and guest was cumbersome
  4. Boot time - waiting for the VM to start disrupted my workflow

Most frustrating was the isolation. The VM felt like a completely separate machine, which was sometimes what I wanted but often created unnecessary barriers. For example, Office software (usually Windows-based) couldn’t directly access analysis files and directories, and pandas (running on Linux) couldn’t open some Excel files I archive in Windows. Setting up shared directories could mitigate some of these issues, but the inconvenience persisted.


Enter WSL2: The Game Changer

When Microsoft announced WSL2 in 2019, I was skeptical. My experience with the original WSL had been somewhat mixed – slow filesystem performance and compatibility issues made it impractical for my workflow.

WSL2 changed everything. Instead of simply providing Linux-compatible APIs (as WSL1 did), WSL2 runs a real Linux kernel in a highly optimized virtual machine. The difference was immediately noticeable.

What impressed me most:

  1. Fast filesystem performance within the Linux environment
  2. Full system call compatibility allowing tools like Docker to run natively
  3. Integrated experience - I could access my Linux files from Windows Explorer and run Windows executables from the Linux command line
  4. Minimal resource overhead compared to traditional VMs
  5. GPU passthrough for CUDA workloads

Note: While #5 would be critical in some ML development circumstances, it doesn’t matter much for my current workflow; my Mac Studio or distributed clusters handle such duties (plus, my work laptop isn’t equipped with an NVIDIA GPU). Nonetheless, I tested CUDA functionality with an old Turing-architecture GPU in my old Windows desktop, and it worked impressively smoothly.

The seamless integration was particularly striking. I could type code . in my Linux terminal and have VSCode (a Windows application) open my Linux project files. The boundaries between the two operating systems began to blur.


Under the Hood: How WSL2 Actually Works

After diving deeper into its internals, leveraging what I learned from computer systems and OS courses, I gained a better understanding of what makes WSL2 so effective.

WSL2 uses virtualization technology but differs from traditional VMs in several key ways. It leverages a lightweight utility VM with a specialized Linux kernel optimized by Microsoft. This kernel contains specific patches that improve integration with the Windows host while maintaining full Linux compatibility.

The architecture uses several interesting mechanisms:

1. VFS and 9P File System Protocol

One of the most ingenious aspects of WSL2 is how it handles file system operations. Traditional virtual machines use solutions like shared folders, which are often slow and limited. WSL2 instead leverages the 9P protocol (originally from Plan 9) for file sharing between Windows and Linux.

The 9P implementation enables:

  • High-performance file access across OS boundaries
  • Preservation of Linux file permissions
  • Proper handling of symbolic links and other Unix-specific file attributes

When you access Windows files from Linux (in the /mnt/c directory), the requests are translated via this protocol. This translation happens at the VFS (Virtual File System) layer in the Linux kernel, making it transparent to applications.
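
You can see this from any WSL2 shell; the Windows drives appear as ordinary mounts of type 9p (a quick check, with details varying by setup):

# The C: drive should show up as a 9P mount under /mnt/c
mount | grep 9p

# Windows files are then reachable as regular Linux paths
ls /mnt/c/Users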

2. VSOCK Communication Channel

WSL2 uses a virtualization socket (VSOCK) communication channel to facilitate fast, efficient communication between the Windows host and Linux guest. This channel powers several integration features:

  • Launching Windows applications from WSL
  • Network port forwarding
  • Process interoperability

This is why you can type explorer.exe . in your WSL terminal and have Windows Explorer open your current Linux directory.
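
A few concrete interop commands I use daily (the paths here are just placeholders):

# Open the current Linux directory in Windows Explorer
explorer.exe .

# Send Linux command output straight to the Windows clipboard
cat results.csv | clip.exe

# Translate between Windows and Linux path conventions
wslpath 'C:\Users\me\Documents'
wslpath -w ~/projects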

3. Memory Management

Unlike traditional VMs that have fixed memory allocation, WSL2 uses a dynamic memory model. It starts with minimal memory usage and scales based on demand, up to a configurable limit. When memory pressure decreases, it releases resources back to the host.

This is implemented through what Microsoft refers to as the Hyper-V architecture’s dynamic memory capabilities, with optimizations specific to the WSL2 use case.
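
You can watch this from inside the distro; the total that free reports is whatever the utility VM currently holds (bounded by the cap configured in .wslconfig, covered in the setup guide below), not your full physical RAM:

# Memory currently assigned to the WSL2 VM
free -h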

4. GPU Passthrough

For AI/ML workloads, GPU acceleration is often critical. WSL2 implements GPU passthrough using a paravirtualized GPU driver model. This architecture allows CUDA workloads to run in Linux while still using the Windows NVIDIA drivers.

The implementation uses a specialized component called the WSL GPU driver that works with the DirectX driver on Windows to provide GPU access to Linux applications:

+----------------------+      +----------------------+
| Linux Application    |      | Windows Application  |
+----------------------+      +----------------------+
| CUDA / OpenGL / etc. |      | DirectX / CUDA       |
+----------------------+      +----------------------+
| WSL GPU Driver       |      | DirectX Driver       |
+----------------------+      +----------------------+
| Linux Kernel         |      | Windows Kernel       |
+----------------------+      +----------------------+
              \                /
               \              /
          +--------------------+
          | Physical GPU       |
          +--------------------+

This approach delivers near-native GPU performance while maintaining the security boundaries of WSL2.
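
On a machine with a supported NVIDIA GPU, you can see this arrangement from inside WSL; the driver libraries are projected in from Windows rather than installed via apt (a quick sanity check I ran on my old desktop, with output varying by driver version):

# nvidia-smi is provided by the Windows driver, not a Linux package
nvidia-smi

# The projected driver libraries live under /usr/lib/wsl/lib
ls /usr/lib/wsl/lib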


Application Ecosystem: Strategic Placement of Tools

After settling into WSL2, I discovered that choosing which tools to install where makes a significant difference in workflow efficiency. Through trial and error, I’ve developed a strategic approach to application placement that maximizes the strengths of both environments.

Windows-Side Applications

I keep these applications on the Windows side for optimal performance and integration:

  • Visual Studio Code: Although it can run in WSL, installing it on Windows with the WSL extension provides the best experience—smooth UI with full Linux integration under the hood
  • Web Browsers: Chrome, Firefox, and Edge remain on Windows where they’re optimized, but I can still launch them from WSL with a simple explorer.exe command
  • Office Suite: Excel, PowerPoint, and other Office tools stay in Windows, but I ensure project data is accessible to both operating systems
  • Communication Tools: Slack, Teams, and Zoom perform better as native Windows applications, especially for screen sharing and notifications

Linux-Side Applications

These tools belong in the WSL environment, where they perform best:

  • Development Languages: Python, R, Rust, and Node.js live in Linux for compatibility with deployment environments and to leverage native performance
  • Data Science Libraries: TensorFlow, PyTorch, pandas, and other libraries install cleanly in WSL with a simple apt install or pip install—no Windows dependency headaches
  • Command-Line Tools: Git, curl, wget, and other CLI utilities feel at home in WSL with their full feature sets intact
  • Databases: PostgreSQL, MongoDB, and Redis install more smoothly in WSL and better match production environments
  • Docker: Containerization just works better in Linux—it’s faster, more stable, and avoids the compatibility issues I experienced with Windows Docker Desktop

The Bridge Between Worlds

Some of my favorite integration points have been:

  • Using Windows Terminal as my unified console for both environments
  • Setting up VS Code to seamlessly work with WSL projects
  • Configuring quick shortcuts to access Windows files from Linux and vice versa
  • Creating aliases in my .bashrc to launch Windows applications directly from the Linux command line
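
For example, my .bashrc carries a few small helpers along these lines (the names and paths are just illustrative):

# Open Windows Explorer in the current Linux directory
alias open='explorer.exe .'

# Copy command output to the Windows clipboard, e.g. cat log.txt | clip
alias clip='clip.exe'

# Jump to a frequently used Windows folder (replace yourname with your Windows user name)
alias winhome='cd /mnt/c/Users/yourname'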

This strategic division ensures that each application runs in its optimal environment while maintaining a cohesive workflow. In practice, I frequently have a Linux terminal running data processing scripts right alongside Excel visualizing the results—each tool playing to its strengths without conflict.

The beauty of this approach is that it’s flexible. As WSL2 evolves, some applications that once performed better on Windows now run perfectly well in Linux. I periodically reevaluate where each tool lives, always optimizing for the smoothest possible workflow across this hybrid environment.


My WSL2 Quick Setup Guide

Let me share the setup that works exceptionally well for my use cases.

Install WSL2

First, enable the WSL features (necessary on older Windows 10 builds) and install a Linux distribution:

# Run these in PowerShell as Administrator (the conventional cmd.exe works equivalently)

# Enable WSL and Virtual Machine Platform features
# You may not need this if using Windows 11
dism.exe /online /enable-feature /featurename:Microsoft-Windows-Subsystem-Linux /all /norestart
dism.exe /online /enable-feature /featurename:VirtualMachinePlatform /all /norestart

# Restart your computer

# Set WSL2 as default
wsl --set-default-version 2

# Install
# By default, WSL installs the Ubuntu distro (you can specify which distro to install with the --distribution or -d option)
# Below is effectively identical to "wsl --install -d Ubuntu"
wsl --install

After the installation, it’s a good idea to update the distro’s packages to the latest versions.

# Open a terminal running Ubuntu via WSL2

# Update and upgrade packages
sudo apt update && sudo apt upgrade

Configure WSL2 Global Settings (Optional)

You may want to create a .wslconfig file in your Windows user directory to optimize performance:

[wsl2]
memory=12GB
processors=6
localhostForwarding=true
kernelCommandLine=net.ifnames=0

This example configuration:

  • Limits WSL2 to 12GB of RAM (adjust based on your system)
  • Allocates 6 CPU cores
  • Enables localhost forwarding for web development
  • Uses predictable network interface names
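
Note that .wslconfig is read only when the WSL2 VM starts, so changes take effect after a restart; from PowerShell:

# Shut down all WSL distros so the new limits apply on the next launch
wsl --shutdown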

Configure GPU Acceleration (Optional)

For NVIDIA GPU support (this assumes the NVIDIA driver is already installed on the Windows side; the CUDA toolkit package below comes from NVIDIA’s repository, which may need to be added first):

# Update package list
sudo apt update

# Install CUDA development tools
sudo apt install -y cuda-toolkit-11-8

# Verify installation
nvidia-smi

Git Setup

I use Git to manage versions across projects and GitHub for collaboration and remote repositories.

# Install Git
sudo apt install git

# Configure Git
# Replace with your user name and email address
git config --global user.name "Your Name"
git config --global user.email "your.email@example.com"

# Set up SSH keys for GitHub/GitLab
# Replace with your email address
ssh-keygen -t ed25519 -C "your.email@example.com"
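
After generating the key, I load it into the agent and copy the public half into the GitHub (or GitLab) SSH key settings:

# Start the agent and register the new key
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/id_ed25519

# Print the public key, then paste it into your Git host's SSH settings
cat ~/.ssh/id_ed25519.pub

# Verify the connection (for GitHub)
ssh -T git@github.com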

C/C++ Environment Setup

Let’s install the compilers and tools I use for C and C++ development.

sudo apt update && sudo apt upgrade
sudo apt install build-essential gcc g++ clang gdb valgrind cmake
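
A quick smoke test that the toolchain is wired up correctly (hello.c is a throwaway file created on the spot):

# Create a minimal C program and build it with both compilers
printf '#include <stdio.h>\nint main(void) { puts("hello from WSL2"); return 0; }\n' > hello.c
gcc hello.c -o hello_gcc && ./hello_gcc
clang hello.c -o hello_clang && ./hello_clang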

Python Environment Setup with uv

I’ve recently switched to using uv, a drastically faster alternative to pip and venv, which has significantly improved my Python workflow:

# Install Python 3.13
sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt update
sudo apt install python3.13 python3.13-dev python3.13-venv

# Install uv
curl -sSf https://astral.sh/uv/install.sh | bash

And here is how to use it:

# Make and move to project directory
mkdir some_project && cd some_project

# Create a new environment with uv
uv venv .venv

# Activate the environment
source .venv/bin/activate

# Install packages with uv pip
# much faster than regular pip!
# you can list as many packages as necessary (but over-installation is discouraged)
uv pip install numpy pandas

# Create a requirements.txt file
uv pip freeze > requirements.txt

# Install from requirements file (with parallelism)
uv pip install -r requirements.txt

The speed difference is remarkable. What used to take minutes with pip now completes in seconds with uv. For large ML libraries like TensorFlow and PyTorch, this efficiency is a game-changer, especially when I am frequently creating new environments for different projects.
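
Beyond the pip-compatible interface shown above, uv also offers a project-based workflow built around pyproject.toml and a lockfile; here is a brief sketch (my_project is just an example name):

# Initialize a new project (creates pyproject.toml)
uv init my_project && cd my_project

# Add dependencies; uv resolves and locks them automatically
uv add numpy pandas

# Run code inside the project's managed environment
uv run python -c "import numpy; print(numpy.__version__)"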

I’d like to discuss uv in more depth; a dedicated article about uv-based environment and dependency management will come later.

Rust Environment Setup

As I’ve posted multiple times on this blog, I’ve been enjoying the developer experience of Rust. Elegant expressions, strict error checking at compile time, and memory safety have made me regard it as a safer alternative to C/C++, and for that reason I’m eager to use and promote Rust in the AI/ML space. Here is how I set up Rust on Linux.

First, ensure you have the necessary build tools installed:

sudo apt update && sudo apt upgrade

# You may need the cc linker which can be installed with the build-essential package
# Without this, you may encounter linking errors when compiling some Rust projects
sudo apt install build-essential

Rustup is the recommended tool for installing and managing Rust:

curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

After installing, you’ll need to update your current shell to include Cargo:

source $HOME/.cargo/env

Rust tools are installed to the ~/.cargo/bin directory, and it’s customary for Rust developers to include this directory in their PATH environment variable. The installation should attempt to configure this automatically.

For permanent configuration, add this line to your shell profile file (~/.bashrc or ~/.zshrc):

echo 'export PATH="$HOME/.cargo/bin:$PATH"' >> ~/.bashrc
source ~/.bashrc

Confirm that Rust and Cargo are properly installed:

rustc --version
cargo --version
rustup --version
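
To confirm the toolchain works end to end, a throwaway Cargo project is the quickest check:

# Create, build, and run a new binary crate
cargo new hello_rust
cd hello_rust
cargo run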

Docker in WSL2

Docker runs natively in WSL2, which is perfect for containerized workflows:

# Install Docker
sudo apt install -y apt-transport-https ca-certificates curl software-properties-common
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"
sudo apt update
sudo apt install -y docker-ce

# Add user to docker group to avoid sudo
sudo usermod -aG docker $USER

# Start Docker service
sudo service docker start
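
To confirm the daemon is healthy (you may need to open a new terminal first so the docker group membership takes effect):

# Pulls a tiny test image and prints a confirmation message
docker run hello-world

# Optional: on recent WSL builds you can have systemd start Docker automatically
# by adding the following to /etc/wsl.conf and restarting WSL:
#   [boot]
#   systemd=true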

Docker is a fascinating containerization tool. It is highly important in several key areas: deployment for developers, reproducibility for researchers, standardized environments for IT/tech educators, and more. So, as with uv, I’d like to discuss Docker in more depth in a dedicated article to come later.

Filesystem Performance Optimizations

By default, accessing Windows files from WSL2 (through /mnt/c and friends) can be slow. For frequently accessed project folders, I either mount the Windows directory explicitly or link it into my Linux home:

# Create directory for projects in Linux filesystem
mkdir -p ~/projects

# Mount Windows directory efficiently
sudo mount -t drvfs 'D:\Projects' ~/projects
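
A lighter-weight alternative is a plain symlink into the existing /mnt mount; it’s handy for convenience even though it doesn’t change the underlying 9P performance (the Windows path below is a placeholder). For the hottest repositories, keeping them directly on the Linux filesystem (e.g., under ~/projects) remains the fastest option:

# Link a Windows folder into the Linux home directory for convenience
ln -s /mnt/d/Projects ~/win-projects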

Terminal Setup

I use Windows Terminal with oh-my-zsh for a productive command-line experience:

# Install Zsh
sudo apt install -y zsh

# Install Oh My Zsh
sh -c "$(curl -fsSL https://raw.githubusercontent.com/ohmyzsh/ohmyzsh/master/tools/install.sh)"

# Add WSL-specific configurations to .zshrc
echo 'export DISPLAY=$(grep -m 1 nameserver /etc/resolv.conf | awk "{print \$2}"):0' >> ~/.zshrc
echo 'export LIBGL_ALWAYS_INDIRECT=1' >> ~/.zshrc

Setup for VSCode (if you use it)

If you plan to use Visual Studio Code with WSL:

  1. Install VS Code on Windows
  2. Install the “Remote - WSL” extension
  3. Open your WSL terminal and navigate to your project
  4. Type code . to open VS Code with WSL integration

VS Code will automatically detect your venv environment when properly set up, making it easy to use your Python interpreter and installed packages.

To improve the experience, you may want to install the right set of extensions for the compilers or tools of your choice, such as rust-analyzer for Rust development.


Balancing Perspectives

While WSL2 has dramatically improved my workflow, it’s not without limitations. Network performance can still lag behind native Linux, particularly for workloads with high I/O. Some specialized hardware may not be accessible. And occasionally, you’ll encounter edge cases where the Linux/Windows boundary becomes apparent.

For instance, I’ve experienced issues with filesystem locks when the same files are accessed simultaneously from both Windows and Linux. Docker performance, while much better than in WSL1, can still fall short of native Linux in high I/O scenarios.

However, the benefits have far outweighed these occasional hurdles. I can now use the same familiar Linux tools and commands across all my development environments while maintaining access to Windows-specific applications when needed.


Conclusion

WSL2 has fundamentally changed how I approach development on Windows. What started as a stopgap solution has evolved into my preferred environment for AI and machine learning work. The ability to leverage both ecosystems—Windows for its enterprise integration and Linux for its development tooling—has eliminated much of the friction in my cross-platform workflow.

The recent addition of tools like uv for Python package management has further streamlined my development process, while VS Code’s extensive extension ecosystem enables language-specific optimizations for Python, Rust, and C/C++ development.

For those working in corporate environments with Windows machines but craving a Linux development experience, WSL2 offers a compelling solution. It’s not perfect, but it’s remarkably close—close enough that I no longer feel the need to dual-boot or manage full virtual machines for most of my development work.

As Microsoft continues to invest in WSL2, the experience will only improve. The boundaries between these once-competing operating systems continue to blur, and for developers who need to span both worlds, that’s something worth celebrating.