Visual Studio Code and AI-Powered Development¶

This work is licensed under a Creative Commons Attribution 4.0 International License.
Overview¶
Visual Studio Code (VS Code) is the world's most widely used code editor, chosen by over 70% of respondents in recent developer surveys. Its extensibility, cross-platform support, and vibrant ecosystem make it an ideal foundation for AI-powered development workflows.
This guide covers:
- VS Code installation on all major platforms
- Alternative VS Code-based IDEs (Positron, Google Antigravity)
- AI coding extensions (Claude Code, GitHub Copilot, Cline, Roo Code, and more)
- Local AI integration with Ollama
- Practical workflows for academic users
Why VS Code for AI-Assisted Coding?
VS Code offers unique advantages for AI-powered development:
- Extensive AI extension ecosystem - Choose from dozens of AI assistants
- Integrated terminal - Run AI-generated code without leaving the editor
- Multi-file editing - AI tools can understand and modify entire projects
- Git integration - Version control your AI-assisted work
- Free and open-source - No licensing costs for the base editor
- Cross-platform - Same experience on Windows, macOS, and Linux
1. Installing Visual Studio Code¶
Windows¶
- Download the VS Code installer for Windows
- Run the downloaded `.exe` file
- Follow the installation wizard:
  - Accept the license agreement
  - Choose the installation location (default is recommended)
  - Select additional tasks:
    - Add "Open with Code" to the context menu
    - Register Code as an editor for supported file types
    - Add to PATH (important for command-line use)
- Click Install and then Finish
macOS¶
- Download VS Code for macOS
- Open the downloaded `.zip` file
- Drag Visual Studio Code.app to the Applications folder
- Launch VS Code from Applications or Spotlight
- (Optional) Add to Dock for quick access
Enable Command Line:
- Open VS Code
- Press Cmd+Shift+P to open the Command Palette
- Type "shell command" and select Shell Command: Install 'code' command in PATH
- Now you can open files with `code filename.py` from Terminal
Linux¶
Debian/Ubuntu:
# Download and install the Microsoft GPG key
wget -qO- https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > packages.microsoft.gpg
sudo install -D -o root -g root -m 644 packages.microsoft.gpg /etc/apt/keyrings/packages.microsoft.gpg
# Add the VS Code repository
sudo sh -c 'echo "deb [arch=amd64,arm64,armhf signed-by=/etc/apt/keyrings/packages.microsoft.gpg] https://packages.microsoft.com/repos/code stable main" > /etc/apt/sources.list.d/vscode.list'
# Update and install
sudo apt update
sudo apt install code
# Verify installation
code --version
Fedora/RHEL:
# Import the Microsoft GPG key
sudo rpm --import https://packages.microsoft.com/keys/microsoft.asc
# Add the VS Code repository
sudo sh -c 'echo -e "[code]\nname=Visual Studio Code\nbaseurl=https://packages.microsoft.com/yumrepos/vscode\nenabled=1\ngpgcheck=1\ngpgkey=https://packages.microsoft.com/keys/microsoft.asc" > /etc/yum.repos.d/vscode.repo'
# Install VS Code
sudo dnf check-update
sudo dnf install code
# Verify installation
code --version
Verify Installation¶
After installation, verify VS Code is working:
# Check VS Code version
code --version
# Open current directory in VS Code
code .
# Open a specific file
code myfile.py
2. Alternative VS Code-Based IDEs¶
Several alternative IDEs build on VS Code's foundation, offering specialized features for different workflows.
Posit Positron¶
Positron is a next-generation data science IDE developed by Posit (formerly RStudio). Built on VS Code's foundation, it provides native support for Python and R with a data-science-focused interface.
Key Features:
| Feature | Description |
|---|---|
| Multi-language Console | Switch between R and Python in the same session |
| Variables Pane | Inspect data frames, lists, and objects visually |
| Data Viewer | Explore large datasets with filtering and sorting |
| Plot Pane | View and export visualizations interactively |
| VS Code Extensions | Access the full VS Code extension marketplace |
| Quarto Integration | Native support for scientific publishing |
AI Integration:
Positron supports the same AI extensions as VS Code, plus Posit's own AI packages:
- ellmer (R) - Unified interface for 20+ LLM providers
- chatlas (Python) - LLM chat framework with streaming and tool calling
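As a quick illustration, a chatlas session in Positron's Python console might look like the sketch below; the class name follows the chatlas documentation, and the model string is a placeholder to replace with whatever your provider currently offers:

```python
from chatlas import ChatAnthropic

# Reads ANTHROPIC_API_KEY from the environment (see the API key section later in this guide)
chat = ChatAnthropic(model="claude-sonnet-4-5")

# Ask a question; the response streams into the console
chat.chat("Summarize what a mixed-effects model is in two sentences.")
```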
See Full Installation Instructions
For detailed Positron installation instructions, see our Posit (RStudio) guide.
Best For: Data scientists, statisticians, R programmers, and researchers who need robust data exploration tools alongside AI assistance.
Google Antigravity¶
Google Antigravity is Google's experimental AI-first code editor built on VS Code. It features deep Gemini integration for next-generation development workflows.
What Makes Antigravity Different:
- Built-in Gemini AI Chat - No extension installation required
- Agentic Coding - AI can run commands, edit files, and iterate on tasks
- Google Cloud Integration - Seamless connection to Google Cloud services
- Experimental Features - Early access to Google's latest AI capabilities
Installation¶
Apple Silicon (M1/M2/M3/M4):
- Visit antigravity.google/download
- Download the macOS ARM64 installer
- Open the downloaded `.dmg` file
- Drag Antigravity to your Applications folder
- Launch from Applications or Spotlight
Intel Mac:
- Download the macOS x64 installer from the same page
- Follow the same installation steps
Windows:
- Visit antigravity.google/download
- Download the Windows installer (`.exe`)
- Run the installer and follow the setup wizard
- Launch Antigravity from the Start menu
Using Antigravity's Built-in AI¶
Antigravity includes a built-in AI chat interface powered by Gemini:
- Open AI Chat: Press Ctrl+Shift+I (Windows/Linux) or Cmd+Shift+I (macOS)
- Start Chatting: Type your request in natural language
- Apply Suggestions: AI can directly edit your code with your approval
- Run Commands: The AI can execute terminal commands to test and verify changes
Example Workflow:
You: Create a Python script that reads a CSV file and generates a summary report
Antigravity AI: I'll create that for you. Let me:
1. Create a new file called csv_analyzer.py
2. Write the code using pandas for data analysis
3. Add error handling and documentation
[AI creates the file and shows preview]
Would you like me to run this script to test it?
Authentication:
- Sign in with your Google account when first launching Antigravity
- Gemini API usage is included (no separate API key needed for basic features)
- Google Cloud integration requires additional authentication
Antigravity vs VS Code¶
| Feature | VS Code | Antigravity |
|---|---|---|
| Base Editor | VS Code core | VS Code fork |
| AI Assistant | Extensions required | Built-in Gemini |
| Pricing | Free + extension costs | Free (Google account) |
| Agentic Features | Via extensions | Native support |
| Extension Support | Full marketplace | VS Code compatible |
| Cloud Integration | Via extensions | Native Google Cloud |
| Stability | Production-ready | Experimental |
When to Use Antigravity
Choose Antigravity if:
- You want AI built-in without extension setup
- You're already using Google Cloud services
- You want to try Google's latest AI features
- You prefer Gemini models for coding
Choose VS Code if:
- You need production stability
- You want to choose your own AI provider
- You need maximum extension compatibility
- You prefer Claude, GPT, or other models
3. AI Extensions for VS Code¶
VS Code's extension marketplace offers numerous AI coding assistants. Here are the most powerful options:
Claude Code (Anthropic)¶
Claude Code is Anthropic's official VS Code extension, bringing Claude's powerful coding abilities directly into your editor.
Key Features:
- Multi-file context awareness
- Inline code completion and suggestions
- Interactive chat panel for complex requests
- Terminal command generation and execution
- Git integration for version control assistance
- Support for Claude Sonnet 4.5, Opus, and Haiku models
Installation:
- Open VS Code
- Press Ctrl+Shift+X (Windows/Linux) or Cmd+Shift+X (macOS) to open Extensions
- Search for "Claude Code"
- Click Install on the official Anthropic extension
Setup:
- Click the Claude icon in the Activity Bar
- Sign in with your Claude.ai account, or
- Enter your Anthropic API key (get one at console.anthropic.com)
Quick Start Example:
Press Ctrl+Shift+P → "Claude: Open Chat"
You: Explain this function and suggest improvements
[Select code in editor]
Claude: This function calculates factorial recursively. Here's my analysis:
- Time complexity: O(n)
- Space complexity: O(n) due to call stack
- Issue: No handling for negative numbers
Suggested improvements:
[Claude provides refactored code with error handling]
See Full Tutorial
For comprehensive Claude Code workflows, see our Claude Code Tutorial.
GitHub Copilot¶
GitHub Copilot is GitHub's AI pair programmer, offering real-time code suggestions as you type.
Key Features:
- Inline code completions (ghost text)
- Multi-line suggestions
- Context-aware from open files
- Support for dozens of programming languages
- GitHub Copilot Chat for conversational assistance
Pricing:
| Plan | Price | Features |
|---|---|---|
| Individual | $10/month | Code completions, chat |
| Business | $19/user/month | Team management, policy controls |
| Enterprise | $39/user/month | Advanced security, fine-tuning |
| Free for Education | $0 | Full access for verified students/educators |
See Full Setup Instructions
For detailed GitHub Copilot setup, see our GitHub Copilot guide.
Cline (formerly Claude Dev)¶
Cline is an open-source VS Code extension that pioneered the "bring your own model" (BYOM) approach, allowing you to use any AI provider.
Key Features:
- Model-agnostic: Use Claude, GPT, Gemini, or local models
- Agentic capabilities: Can run commands and modify files
- MCP (Model Context Protocol) support
- Transparent pricing: Pay per API request
- Open-source and community-driven
Installation:
- Open VS Code Extensions (Ctrl+Shift+X)
- Search for "Cline"
- Click Install
API Key Setup:
- Open Cline settings (gear icon in the Cline panel)
- Select your provider:
  - Anthropic: Get an API key from console.anthropic.com, paste it in Cline settings, and select a model (Claude Sonnet 4.5 recommended)
  - OpenAI: Get an API key from platform.openai.com, paste it in Cline settings, and select a model (GPT-4o recommended)
  - Google: Get an API key from aistudio.google.com, paste it in Cline settings, and select a model (Gemini 2.5 Pro recommended)
  - Ollama (local): Ensure Ollama is running locally, select "Ollama" as the provider in Cline, and choose from your downloaded models - no API key required
Usage Example:
Open Cline panel → Type your request
You: Create a REST API endpoint for user authentication using FastAPI
Cline: I'll create that for you. Here's my plan:
1. Create auth.py with login/logout endpoints
2. Add JWT token generation
3. Create user model and validation
4. Update main.py to include the router
[Cline shows file changes for approval]
Do you want me to apply these changes?
Roo Code¶
Roo Code is a fork of Cline focused on rapid feature development and customization.
Key Features:
- All Cline features plus experimental capabilities
- Custom model presets and configurations
- Advanced prompt customization
- Frequent updates with new features
- Community-driven development
Installation:
- Open VS Code Extensions
- Search for "Roo Code" or "Roo Cline"
- Click Install
- Configure API keys same as Cline
Best For: Users who want cutting-edge features and don't mind occasional instability.
ChatGPT / CodeGPT Extensions¶
Several extensions bring OpenAI's GPT models to VS Code:
CodeGPT - Popular multi-provider extension
Installation:
- Open VS Code Extensions
- Search for "CodeGPT"
- Click Install
Setup:
- Open CodeGPT settings
- Select "OpenAI" as provider
- Enter your API key from platform.openai.com
- Select model (GPT-4o or GPT-4o-mini)
OpenAI Account Setup
For detailed OpenAI account and API setup, see our ChatGPT guide.
Google Gemini Extensions¶
Gemini CLI Companion brings Google's Gemini models to VS Code.
Installation:
- Open VS Code Extensions
- Search for "Gemini" (look for official Google extension)
- Click Install
Setup:
- Sign in with your Google account, or
- Enter API key from aistudio.google.com
Google AI Account Setup
For detailed Gemini account setup, see our Gemini guide.
4. AI Extension Comparison¶
| Extension | Provider | Pricing | Agentic | Local Models | Best For |
|---|---|---|---|---|---|
| Claude Code | Anthropic | API or subscription | Yes | No | Full-stack development, documentation |
| GitHub Copilot | GitHub/OpenAI | $10-39/month | Limited | No | Inline completions, GitHub users |
| Cline | Multi-provider | API costs only | Yes | Yes (Ollama) | Budget-conscious, model flexibility |
| Roo Code | Multi-provider | API costs only | Yes | Yes (Ollama) | Experimental features |
| CodeGPT | Multi-provider | API costs only | Limited | Yes (Ollama) | Simple setup, multi-provider |
| Gemini Companion | Google | API or free tier | Limited | No | Google Cloud users |
Recommendation for Academic Users
For beginners: Start with GitHub Copilot (free for educators/students) for inline completions.
For research: Use Cline with Ollama for privacy-sensitive work with local models.
For serious development: Claude Code offers the best balance of capability and reliability.
5. Setting Up API Keys¶
VS Code User Settings¶
Store API keys in VS Code settings for extension-specific configuration:
- Press Ctrl+, (Windows/Linux) or Cmd+, (macOS)
- Search for your extension name
- Find the API key setting and enter your key
Security Note
API keys stored in VS Code settings are saved in plain text. For better security, use environment variables.
Environment Variables¶
The most secure way to manage API keys:
macOS/Linux: Add an export line for each key to your shell profile (~/.bashrc, ~/.zshrc, or ~/.bash_profile), for example `export ANTHROPIC_API_KEY="your-key-here"`, then open a new terminal.
Windows:
- Press Win+R, type `sysdm.cpl`, and press Enter
- Click the Advanced tab
- Click Environment Variables
- Under "User variables", click New
- Enter the variable name (e.g., `ANTHROPIC_API_KEY`) and the value
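Once a key is set in the environment, extensions can pick it up and your own scripts can read it without hardcoding secrets. A minimal Python check (the variable name matches the example above):

```python
import os

# Read the key from the environment rather than hardcoding it in source files
api_key = os.environ.get("ANTHROPIC_API_KEY")
if not api_key:
    raise RuntimeError("ANTHROPIC_API_KEY is not set - add it to your shell profile or Windows user variables")
```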
Security Best Practices¶
Best Practices:
- Use environment variables instead of hardcoding keys
- Rotate keys regularly (every 90 days recommended)
- Use separate keys for development and production
- Set spending limits in your provider dashboards
- Revoke compromised keys immediately
6. Ollama Integration (Local Models)¶
Run AI models locally for privacy, offline access, and cost savings. See our full Ollama guide for installation.
Extensions Supporting Ollama¶
| Extension | Ollama Support | Configuration |
|---|---|---|
| Cline | Native | Select "Ollama" provider |
| Roo Code | Native | Select "Ollama" provider |
| CodeGPT | Native | Select "Ollama" provider |
| Continue | Native | Add Ollama to config |
Configuration Example (Cline)¶
- Install and start Ollama
- Pull a coding model (for example, `ollama pull qwen2.5-coder:7b` - see the model recommendations below)
- In Cline settings:
  - Provider: Ollama
  - Model: Select from your downloaded models
  - Base URL: `http://localhost:11434` (default)
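To see for yourself that requests stay on your machine, you can call the same local endpoint Cline uses. A minimal sketch against Ollama's REST API, assuming Ollama is running on the default port and you have pulled a model such as `qwen2.5-coder:7b`:

```python
import requests

# Ask the local Ollama server for a completion; no data leaves your machine
response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "qwen2.5-coder:7b",  # any model shown by `ollama list`
        "prompt": "Write a Python function that reverses a string.",
        "stream": False,              # return a single JSON object instead of a stream
    },
    timeout=120,
)
print(response.json()["response"])
```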
Example Workflow¶
# Terminal: Start Ollama (if not running as service)
ollama serve
# In VS Code with Cline:
# 1. Open Cline panel
# 2. Ensure Ollama is selected as provider
# 3. Start coding!
You: Explain this Python decorator and add type hints
Cline (via local Codellama): [Provides explanation and modified code]
# All processing happens locally - no data leaves your machine
Model Recommendations for Coding
| Model | Size | RAM Needed | Best For |
|---|---|---|---|
| qwen2.5-coder:7b | 4.7GB | 8GB | Fast coding on laptops |
| codellama:13b | 7.4GB | 16GB | Balanced coding |
| deepseek-coder:33b | 19GB | 32GB | Complex coding tasks |
| codestral:latest | 12GB | 24GB | Multi-language coding |
7. Practical Examples¶
Code Generation¶
Request: Create a Python class for managing an academic bibliography, with BibTeX import and export.
AI Response (via Claude Code):
import bibtexparser
from dataclasses import dataclass, field
from typing import List, Optional
from pathlib import Path

@dataclass
class BibEntry:
    """Represents a single bibliography entry."""
    entry_type: str
    cite_key: str
    title: str
    authors: List[str]
    year: int
    journal: Optional[str] = None
    volume: Optional[str] = None
    pages: Optional[str] = None
    doi: Optional[str] = None

class BibliographyManager:
    """Manage academic bibliography with BibTeX support."""

    def __init__(self):
        self.entries: List[BibEntry] = []

    def import_bibtex(self, filepath: Path) -> int:
        """Import entries from a BibTeX file."""
        with open(filepath) as f:
            bib_db = bibtexparser.load(f)
        for entry in bib_db.entries:
            self.entries.append(BibEntry(
                entry_type=entry.get('ENTRYTYPE', 'article'),
                cite_key=entry.get('ID', ''),
                title=entry.get('title', ''),
                authors=entry.get('author', '').split(' and '),
                year=int(entry.get('year', 0)),
                journal=entry.get('journal'),
                doi=entry.get('doi')
            ))
        return len(bib_db.entries)

    def export_bibtex(self, filepath: Path) -> None:
        """Export all entries to a BibTeX file."""
        # Implementation continues...
Code Explanation¶
Select code, then ask: Explain this regular expression and suggest how to make it more readable.
AI explains:
This regex `r'^(?P<year>\d{4})-(?P<month>\d{2})-(?P<day>\d{2})$'` matches ISO date format:
- `^` - Start of string
- `(?P<year>\d{4})` - Named group "year": exactly 4 digits
- `-` - Literal hyphen
- `(?P<month>\d{2})` - Named group "month": exactly 2 digits
- `-` - Literal hyphen
- `(?P<day>\d{2})` - Named group "day": exactly 2 digits
- `$` - End of string
Suggested improvement for readability:
```python
DATE_PATTERN = re.compile(
    r'^'
    r'(?P<year>\d{4})'   # Year: 4 digits
    r'-'
    r'(?P<month>\d{2})'  # Month: 01-12
    r'-'
    r'(?P<day>\d{2})'    # Day: 01-31
    r'$',
    re.VERBOSE
)
```
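Continuing from the compiled pattern above, the named groups can be read back directly (standard `re` usage, added here as a quick illustration):

```python
# Match an ISO date and pull out the named groups
match = DATE_PATTERN.match("2024-05-17")
if match:
    print(match.group("year"), match.group("month"), match.group("day"))  # 2024 05 17
```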
Debugging Assistance¶
Paste error message: `TypeError: 'NoneType' object is not subscriptable` (raised at line 45)
AI diagnosis:
This error occurs when you try to access an index or key on a None value.
Looking at line 45: `result = data['key'][0]`
The issue is likely that `data['key']` returns None instead of a list.
Suggested fix:
```python
# Add defensive check
value = data.get('key')
if value is not None and len(value) > 0:
    result = value[0]
else:
    result = default_value  # or raise a more informative error
```
Refactoring¶
Request: Refactor this function to use pathlib, add type hints and a docstring, and handle file errors.
Before:
def process_files(paths):
    results = []
    for path in paths:
        f = open(path)
        data = f.read()
        results.append(data.upper())
        f.close()
    return results
After (AI refactored):
from pathlib import Path
from typing import List

def process_files(paths: List[str]) -> List[str]:
    """
    Read files and return their contents in uppercase.

    Args:
        paths: List of file paths to process

    Returns:
        List of file contents converted to uppercase

    Raises:
        FileNotFoundError: If any file doesn't exist
        PermissionError: If any file can't be read
    """
    results = []
    for path in paths:
        try:
            content = Path(path).read_text(encoding='utf-8')
            results.append(content.upper())
        except (FileNotFoundError, PermissionError) as e:
            raise type(e)(f"Error processing {path}: {e}") from e
    return results
8. Choosing the Right Tool¶
IDE Selection Guide¶
graph TD
A[What's your primary use case?] --> B{Data Science?}
B -->|Yes| C{R or Python?}
B -->|No| D{Want built-in AI?}
C -->|R focused| E[Positron]
C -->|Python focused| F{Need data viewers?}
F -->|Yes| E
F -->|No| G[VS Code + Extensions]
D -->|Yes| H{Google Cloud user?}
D -->|No| G
H -->|Yes| I[Antigravity]
H -->|No| J{Prefer Gemini?}
J -->|Yes| I
J -->|No| G
Extension Selection Guide¶
| If you need... | Use this... | Why |
|---|---|---|
| Best inline completions | GitHub Copilot | Industry standard, fast, reliable |
| Strongest reasoning | Claude Code | Best for complex tasks, documentation |
| Maximum flexibility | Cline | Use any model, including local |
| Privacy/offline | Cline + Ollama | Everything runs locally |
| Google integration | Gemini extensions | Native Google Cloud support |
| Free option | Copilot (edu) or Cline + Ollama | No cost for students/educators |
| Experimental features | Roo Code | Cutting-edge capabilities |
Combining Multiple Extensions¶
You can install multiple AI extensions and use them for different tasks:
- GitHub Copilot - Always-on inline completions
- Claude Code - Complex refactoring and documentation
- Cline + Ollama - Privacy-sensitive or offline work
Avoiding Conflicts
If you have multiple AI extensions:
- Disable inline completions in all but one extension
- Use keyboard shortcuts to invoke specific assistants
- Check for conflicting keybindings in VS Code settings
9. Troubleshooting¶
Common Issues¶
Extension won't authenticate
Symptoms: "Invalid API key" or "Authentication failed" errors
Solutions:
- Verify your API key is correct (no extra spaces)
- Check if your API key has expired
- Ensure you have sufficient credits/quota
- Try regenerating a new API key
- Check firewall/proxy settings aren't blocking requests
Slow or no responses
Symptoms: Long wait times or timeouts
Solutions:
- Check your internet connection
- Verify the AI service isn't experiencing outages
- Try a different/faster model
- For Ollama: ensure you have enough RAM for the model
- Reduce context size (close unnecessary files)
Code suggestions are wrong or outdated
Symptoms: AI suggests deprecated APIs or incorrect syntax
Solutions:
- Provide more context in your prompts
- Specify the language version explicitly
- Include relevant documentation links
- Use a more capable model
- Update the extension to latest version
Ollama models won't load
Symptoms: "Model not found" or memory errors
Solutions:
- Verify the model is downloaded: `ollama list`
- Check available RAM vs model requirements
- Try a smaller model
- Restart the Ollama service
- Check the Ollama server logs for errors
Extension conflicts
Symptoms: Multiple completions, keyboard shortcut issues
Solutions:
- Disable inline suggestions in all but one extension
- Check keybindings: Ctrl+K Ctrl+S
- Use Command Palette to invoke specific extensions
- Update all extensions to latest versions
10. Further Resources¶
Official Documentation¶
Workshop Guides¶
| Topic | Guide |
|---|---|
| Claude Code workflows | Claude Code Tutorial |
| GitHub Copilot setup | GitHub Copilot Guide |
| OpenAI account setup | ChatGPT Guide |
| Google AI setup | Gemini Guide |
| Posit tools and Positron | Posit Guide |
| Local AI with Ollama | Ollama Guide |
| Vibe coding overview | Vibe Coding |
Community Resources¶
Getting Started Checklist
- Install VS Code (or Positron/Antigravity)
- Choose and install an AI extension
- Set up API keys securely
- Test with a simple coding request
- Explore additional extensions as needed
- Consider local models for privacy-sensitive work