By the end of this tutorial, you’ll have a fully functional Bitbucket Pipelines setup that automatically runs tests on every commit. In under an hour, you’ll configure a CI/CD pipeline that catches bugs before they reach production, complete with parallel testing, caching, and deployment strategies.
What You’ll Build
You’ll create a production-ready testing pipeline for your Bitbucket repository that:
- Automatically runs unit, integration, and E2E tests on every push
- Executes tests in parallel to reduce build time by 60%
- Caches dependencies to speed up builds
- Generates test coverage reports
- Deploys automatically when all tests pass
This setup mirrors the approach Atlassian itself and thousands of development teams worldwide use to maintain code quality and ship faster.
Learning Objectives
In this tutorial, you’ll learn:
- How to configure `bitbucket-pipelines.yml` for automated testing
- How to set up parallel test execution for faster feedback
- How to implement caching strategies to optimize build times
- How to integrate test coverage reporting and quality gates
- How to handle different environments (staging, production)
- How to troubleshoot common pipeline issues
Time Estimate: 45-60 minutes
Prerequisites
Required Software
Before starting, install:
| Tool | Version | Purpose |
|---|---|---|
| Git | 2.0+ | Version control |
| Node.js | 18+ | Runtime for example project |
| npm | 9+ | Package management |
| Docker | 20+ | Container runtime (optional) |
Installation:
```shell
# macOS
brew install git node

# Linux (Ubuntu/Debian)
sudo apt-get install git nodejs npm

# Windows
choco install git nodejs
```
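After installing, it helps to confirm each required tool is on your `PATH` and meets the versions in the table above. A quick sketch (adjust the tool list to your setup):

```shell
# Check that git, node, and npm are installed and print their versions.
for tool in git node npm; do
  if command -v "$tool" >/dev/null 2>&1; then
    printf '%s: %s\n' "$tool" "$("$tool" --version | head -n1)"
  else
    printf '%s: NOT FOUND\n' "$tool"
  fi
done
```

Each missing tool is reported instead of aborting, so you see the full picture in one run.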
Required Knowledge
You should be familiar with:
- ✅ Basic Git workflows (commit, push, pull)
- ✅ Command line basics
- ✅ Basic understanding of CI/CD concepts
- ❌ Not required: Advanced Docker or Kubernetes knowledge
Required Resources
- Bitbucket account (free tier works)
- A repository with tests (or use our sample project)
- Write access to repository settings
- 15 minutes for initial pipeline execution
Step 1: Project Setup and Configuration
In this step, we’ll create the basic pipeline configuration file and understand its structure.
Create Pipeline Configuration
Navigate to your repository root and create the configuration file:
```shell
cd your-project
touch bitbucket-pipelines.yml
```
Add the basic pipeline structure:
```yaml
# bitbucket-pipelines.yml
image: node:18

pipelines:
  default:
    - step:
        name: Build and Test
        caches:
          - node
        script:
          - npm install
          - npm test
```
What this does:
- `image: node:18` - Specifies the Docker image for the build environment
- `caches:` with `node` - Caches `node_modules` to speed up subsequent builds
- `script` - Commands executed in sequence
Commit and Push
```shell
git add bitbucket-pipelines.yml
git commit -m "Add Bitbucket Pipelines configuration"
git push origin main
```
You should see:
- Bitbucket automatically detects the configuration
- First pipeline execution starts within 30 seconds
- Pipeline appears in Bitbucket UI under “Pipelines” tab
💡 Pro Tip: Enable Pipelines in Repository Settings → Pipelines → Settings if not already enabled.
Verify Setup
Navigate to your repository in Bitbucket and click the “Pipelines” tab.
Expected result:
```
✅ Pipeline #1 (in progress)
└── Build and Test (running)
```
✅ Checkpoint: You should now have a basic pipeline running. Even if tests fail, the pipeline should execute.
Step 2: Configuring Multiple Test Types
Real-world projects need different types of tests. Let’s configure unit, integration, and linting in parallel.
Update Pipeline Configuration
Modify your bitbucket-pipelines.yml:
```yaml
image: node:18

definitions:
  caches:
    npm: ~/.npm

pipelines:
  default:
    - parallel:
        - step:
            name: Unit Tests
            caches:
              - node
              - npm
            script:
              - npm ci
              - npm run test:unit
            artifacts:
              - coverage/**
        - step:
            name: Linting
            caches:
              - node
            script:
              - npm ci
              - npm run lint
        - step:
            name: Integration Tests
            caches:
              - node
            services:
              - docker
            script:
              - npm ci
              - npm run test:integration
```
What changed:
- `parallel:` - Runs multiple steps simultaneously (60% faster)
- `npm ci` - Clean install, faster and more reliable than `npm install`
- `artifacts` - Saves coverage reports for later steps
- `services:` with `docker` - Starts the Docker daemon for integration tests
Update package.json Scripts
Ensure your package.json has the test scripts:
```json
{
  "scripts": {
    "test:unit": "jest --coverage --testPathPattern=unit",
    "test:integration": "jest --testPathPattern=integration",
    "lint": "eslint . --ext .js,.jsx,.ts,.tsx"
  }
}
```
Common Issues ⚠️
Problem: “npm run test:unit not found”
Solution: Add the missing script to package.json or update the pipeline command to match your actual test command:
```yaml
script:
  - npm ci
  - npm test # Use your actual test command
```
Problem: Out of memory errors during tests
Solution: Increase memory allocation:
```yaml
- step:
    name: Unit Tests
    size: 2x # Double memory (8GB instead of the default 4GB)
```
Verify This Step
Push your changes and monitor the pipeline:
```shell
git add bitbucket-pipelines.yml package.json
git commit -m "Add parallel test execution"
git push
```
Expected result:
```
✅ Pipeline #2 (in progress)
├── Unit Tests (running) [30s]
├── Linting (running) [15s]
└── Integration Tests (running) [45s]
```
✅ Checkpoint: All three test steps should run simultaneously, reducing total execution time.
Step 3: Implementing Advanced Caching
Caching dramatically reduces build times. Let’s implement multi-layer caching.
Enhanced Caching Strategy
Update the pipeline with advanced caching:
```yaml
image: node:18

definitions:
  caches:
    npm: ~/.npm
    cypress: ~/.cache/Cypress
    jest: .jest-cache

pipelines:
  default:
    - parallel:
        - step:
            name: Unit Tests
            caches:
              - node
              - npm
              - jest
            script:
              - npm ci --cache ~/.npm --prefer-offline
              - npm run test:unit -- --cache --cacheDirectory=.jest-cache
            artifacts:
              - coverage/**
              - test-results/**
```
Caching benefits:
- First build: ~120 seconds (no cache)
- Subsequent builds: ~35 seconds (with cache)
- Savings: 70% faster builds
💡 Pro Tip: Cache Docker layers for even faster builds:
```yaml
- step:
    name: Build Docker Image
    caches:
      - docker
    script:
      - docker build -t myapp:$BITBUCKET_COMMIT .
```
Verify Caching Works
Check pipeline logs for cache indicators:
```
Restoring caches...
✓ node: Restored successfully (142.3 MB)
✓ npm: Restored successfully (89.1 MB)
```
✅ Checkpoint: Second build should be 50-70% faster than the first.
Step 4: Adding Test Coverage and Quality Gates
Ensure code quality by adding coverage thresholds.
Configure Coverage Reporting
Add coverage configuration to jest.config.js:
```javascript
// jest.config.js
module.exports = {
  collectCoverage: true,
  coverageDirectory: 'coverage',
  coverageReporters: ['text', 'lcov', 'html'],
  coverageThreshold: {
    global: {
      branches: 80,
      functions: 80,
      lines: 80,
      statements: 80
    }
  }
};
```
Update Pipeline with Quality Gates
```yaml
- step:
    name: Unit Tests with Coverage
    caches:
      - node
    script:
      - npm ci
      - npm run test:unit
      - |
        if [ -f coverage/lcov.info ]; then
          echo "✅ Coverage report generated"
        else
          echo "❌ Coverage report missing"
          exit 1
        fi
    artifacts:
      - coverage/**
```
What this does:
- Generates coverage reports in multiple formats
- Fails the build if coverage falls below 80%
- Saves coverage artifacts for later review
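If you want an explicit numeric gate in the pipeline itself, on top of Jest's `coverageThreshold`, the lcov report can be summed directly from its `LF:` (lines found) and `LH:` (lines hit) records. A sketch — it generates a tiny sample report here so it runs standalone; in a real step you would point the `awk` at `coverage/lcov.info`:

```shell
# A tiny sample report stands in for coverage/lcov.info in this demo.
cat > sample-lcov.info <<'EOF'
SF:src/a.js
LF:10
LH:9
end_of_record
SF:src/b.js
LF:10
LH:8
end_of_record
EOF

# Sum lines-found and lines-hit across all files, then compute the percentage.
pct=$(awk -F: '/^LF:/ {f += $2} /^LH:/ {h += $2} END {printf "%.1f", (f ? 100 * h / f : 0)}' sample-lcov.info)
echo "Line coverage: ${pct}%"

# Fail the step when coverage drops below the 80% gate.
awk -v p="$pct" 'BEGIN {exit (p >= 80 ? 0 : 1)}' && echo "Gate passed"
# → Line coverage: 85.0%
# → Gate passed
```

Because `awk` exits non-zero below the threshold, the pipeline step fails exactly like any other failing script line.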
Integration with Code Coverage Tools
Add Codecov integration:
```yaml
- step:
    name: Upload Coverage
    script:
      - pipe: codecov/codecov-upload:1.3.2
        variables:
          CODECOV_TOKEN: $CODECOV_TOKEN
```
Expected output:
```
✅ Coverage: 87.3% (+2.1%)
✅ All thresholds passed
✅ Uploaded to Codecov
```
✅ Checkpoint: Build should fail if coverage drops below 80%.
Step 5: Environment-Specific Pipelines
Configure different behaviors for branches and environments.
Branch-Specific Pipelines
```yaml
pipelines:
  default:
    - step:
        name: Quick Tests
        script:
          - npm ci
          - npm run test:unit

  branches:
    main:
      - step:
          name: Full Test Suite
          script:
            - npm ci
            - npm run test:unit
            - npm run test:integration
            - npm run test:e2e
      - step:
          name: Deploy to Staging
          deployment: staging
          script:
            - npm run deploy:staging
    develop:
      - step:
          name: Development Tests
          script:
            - npm ci
            - npm run test:unit
            - npm run lint

  pull-requests:
    '**':
      - step:
          name: PR Validation
          script:
            - npm ci
            - npm run test:unit
            - npm run lint
```
Pipeline behavior:
- Feature branches: Fast unit tests only (2 minutes)
- Pull requests: Unit tests + linting (3 minutes)
- Develop branch: Development tests (4 minutes)
- Main branch: Full suite + deployment (8 minutes)
Environment Variables
Add secrets in Repository Settings → Pipelines → Repository variables:
```yaml
- step:
    name: Integration Tests
    script:
      - export DATABASE_URL=$DATABASE_URL_TEST
      - export API_KEY=$API_KEY_TEST
      - npm run test:integration
```
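Pipelines fail in confusing ways when a repository variable was never set, so a guard at the top of the script helps. A bash sketch, using a hypothetical `require_var` helper — the `DATABASE_URL_TEST` value below is a stand-in for the demo, not a real secret:

```shell
# Fail fast when a required pipeline variable is missing or empty.
require_var() {
  local name=$1
  if [ -z "${!name:-}" ]; then
    echo "Missing required variable: $name" >&2
    return 1
  fi
}

DATABASE_URL_TEST="postgres://user:pass@localhost/test" # stand-in for the repo variable
require_var DATABASE_URL_TEST && echo "DATABASE_URL_TEST is set"
require_var API_KEY_TEST || echo "API_KEY_TEST is missing - check repository variables"
```

Calling `require_var` for each expected variable as the first script lines turns a cryptic mid-test failure into an immediate, named error.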
💡 Pro Tip: Use deployment variables for environment-specific configs:
```yaml
- step:
    name: Deploy to Production
    deployment: production
    script:
      - echo "Deploying with API_KEY: ${API_KEY}"
      - npm run deploy
```
✅ Checkpoint: Different branches should trigger different pipeline configurations.
Step 6: Handling Deployment
Deploy automatically when tests pass.
Conditional Deployment
```yaml
- step:
    name: Deploy to Production
    deployment: production
    trigger: manual
    script:
      - pipe: atlassian/aws-s3-deploy:1.1.0
        variables:
          AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
          AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
          AWS_DEFAULT_REGION: 'us-east-1'
          S3_BUCKET: 'my-production-bucket'
          LOCAL_PATH: 'dist'
```
Deployment options:
- `trigger: manual` - Requires manual approval
- `trigger: automatic` - Deploys automatically when tests pass (the default)
Deployment with Rollback
```yaml
- step:
    name: Deploy with Rollback
    script:
      - export PREVIOUS_VERSION=$(git describe --tags --abbrev=0)
      - npm run deploy
      - npm run healthcheck || (echo "Health check failed, rolling back..." && git checkout $PREVIOUS_VERSION && npm run deploy && exit 1)
```
✅ Checkpoint: Deployment should only occur after all tests pass.
Testing Your Implementation
Now let’s verify everything works correctly.
Manual Testing
Test Case 1: Feature Branch Push
Create and push a feature branch:
```shell
git checkout -b feature/test-pipeline
git push origin feature/test-pipeline
```
Expected result:
```
✅ Quick Tests passed (2m 15s)
```
Test Case 2: Pull Request
Create a PR in the Bitbucket UI.
Expected result:
```
✅ PR Validation passed (3m 30s)
├── Unit Tests ✅
└── Linting ✅
```
Test Case 3: Main Branch Deployment
Merge to main:
```shell
git checkout main
git merge feature/test-pipeline
git push origin main
```
Expected result:
```
✅ Full Test Suite passed (7m 45s)
⏸️ Deploy to Staging (manual trigger available)
```
Automated Health Check Script
Create a validation script:
```shell
#!/bin/bash
# validate-pipeline.sh
echo "🔍 Validating Bitbucket Pipeline Configuration..."

# Check if pipeline file exists
if [ ! -f "bitbucket-pipelines.yml" ]; then
  echo "❌ bitbucket-pipelines.yml not found"
  exit 1
fi
echo "✅ Pipeline configuration exists"

# Validate YAML syntax
if command -v yamllint &> /dev/null; then
  yamllint bitbucket-pipelines.yml || exit 1
  echo "✅ YAML syntax valid"
fi

# Check for required scripts in package.json
required_scripts=("test:unit" "lint")
for script in "${required_scripts[@]}"; do
  if grep -q "\"$script\"" package.json; then
    echo "✅ Script '$script' found"
  else
    echo "❌ Script '$script' missing"
    exit 1
  fi
done

echo "✅ All validations passed! 🎉"
```
Run it:
```shell
chmod +x validate-pipeline.sh
./validate-pipeline.sh
```
Validation Checklist
- Pipeline runs on every push
- Parallel steps execute simultaneously
- Caching reduces subsequent build times
- Coverage reports generated
- Branch-specific behavior works
- Manual deployment requires approval
- Failed tests block deployment
Troubleshooting
Issue 1: Pipeline Not Triggering
Symptoms:
- No pipeline executes after push
- “Pipelines” tab shows no history
Possible Causes:
- Pipelines not enabled in repository settings
- Invalid YAML syntax in configuration file
Solution:
```shell
# Validate YAML syntax locally (the file is mounted at /work inside the container)
docker run --rm -v $(pwd):/work mikefarah/yq eval /work/bitbucket-pipelines.yml

# Enable pipelines in repository settings
# Navigate to: Repository Settings → Pipelines → Settings
# Toggle "Enable Pipelines" ON
```
How to verify it’s fixed:
```shell
git commit --allow-empty -m "Trigger pipeline"
git push
# Check the Pipelines tab within 30 seconds
```
Issue 2: Out of Memory Errors
Error message:
```
FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
```
What it means: Your tests are consuming more than the default 4GB memory limit of a standard step.
Quick fix:
```yaml
- step:
    name: Memory-Intensive Tests
    size: 2x # Increases to 8GB RAM
    script:
      - npm ci
      - NODE_OPTIONS="--max-old-space-size=3072" npm test
```
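To confirm the `--max-old-space-size` value actually reached the Node process, a throwaway script line can print the heap limit V8 reports. A sketch; add it temporarily before `npm test` in the step:

```shell
# Print the V8 heap size limit the test process will run with.
# With --max-old-space-size=3072 this should report roughly 3072 MB plus a small overhead.
NODE_OPTIONS="--max-old-space-size=3072" node -e \
  'const v8 = require("node:v8"); console.log("V8 heap limit:", Math.round(v8.getHeapStatistics().heap_size_limit / 1048576), "MB")'
```

If the printed limit is still the default, the `NODE_OPTIONS` export is not reaching your test runner (some runners spawn workers with a clean environment).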
Detailed fix:
1. Profile your tests to find memory leaks:
```shell
node --inspect-brk node_modules/.bin/jest --runInBand
```
2. Optimize test setup:
```javascript
// jest.config.js
module.exports = {
  maxWorkers: 2, // Limit parallel workers
  clearMocks: true,
  resetMocks: true,
};
```
Issue 3: Slow Build Times
If builds take longer than 5 minutes:
1. Enable parallel execution:
```yaml
- parallel:
    - step:
        name: Fast Tests
    - step:
        name: More Tests
```
2. Optimize caching:
```yaml
definitions:
  caches:
    custom-cache: path/to/cache
```
3. Use `npm ci` instead of `npm install`:
```shell
npm ci # 40% faster than npm install
```
4. Monitor build metrics:
```yaml
# Add to the step's script
- time npm ci
- time npm test
```
Issue 4: Docker Service Connection Failures
Symptoms:
- Integration tests fail with “Cannot connect to Docker daemon”
Solution:
```yaml
- step:
    name: Integration Tests
    services:
      - docker
    script:
      - sleep 10 # Wait for Docker to start
      - docker ps # Verify Docker is running
      - npm run test:integration
```
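A fixed `sleep` either wastes time or isn't long enough. A small polling helper is more reliable — a sketch, where the `wait_for` name is ours; in the pipeline you would call it as `wait_for 30 docker info`, while the demo line uses `true` so the snippet runs anywhere:

```shell
# Retry a command up to N times, one second apart, until it succeeds.
wait_for() {
  local attempts=$1
  shift
  local i
  for ((i = 1; i <= attempts; i++)); do
    if "$@" >/dev/null 2>&1; then
      return 0
    fi
    sleep 1
  done
  echo "Timed out waiting for: $*" >&2
  return 1
}

# In the pipeline step: wait_for 30 docker info
wait_for 5 true && echo "ready"
# → ready
```

Polling returns as soon as the daemon answers, so the step never waits longer than necessary and fails with a clear message when it never comes up.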
Still Having Issues?
- Check the Bitbucket Pipelines documentation
- Review the pipeline validator
- Ask on Atlassian Community
- Check our CI/CD troubleshooting guide for common patterns
Next Steps
Congratulations! You’ve successfully set up automated testing in Bitbucket Pipelines. 🎉
What You’ve Built
You now have:
- ✅ Automated testing on every commit
- ✅ Parallel test execution (60% faster builds)
- ✅ Multi-layer caching strategy
- ✅ Test coverage reporting with quality gates
- ✅ Branch-specific pipeline behavior
- ✅ Manual deployment controls
- ✅ Production-ready CI/CD pipeline
Level Up Your Skills
Ready for more? Try these enhancements:
Easy Enhancements (30 min each)
Add Slack Notifications
```yaml
- step:
    name: Notify on Failure
    script:
      - pipe: atlassian/slack-notify:2.1.0
        variables:
          WEBHOOK_URL: $SLACK_WEBHOOK
```
Add Security Scanning
```yaml
- step:
    name: Security Audit
    script:
      - npm audit --audit-level=moderate
```
Intermediate Enhancements (1-2 hours each)
Add Performance Testing
- Integrate Lighthouse CI
- Set performance budgets
- Fail builds on regression
Implement Blue-Green Deployment
- Deploy to staging environment
- Run smoke tests
- Swap to production
Advanced Enhancements (3+ hours)
- Multi-Cloud Deployment
- Deploy to AWS, Azure, GCP simultaneously
- Geographic routing
- Failover strategies
Related Tutorials
Continue learning:
- Test Parallelization in CI/CD - Advanced parallelization strategies
- Docker in CI/CD Best Practices - Optimize container-based pipelines
- Monitoring CI/CD Performance - Track and improve pipeline metrics
Share Your Results
Built an impressive pipeline? Share it:
- Tweet your setup with #BitbucketPipelines
- Write about your experience on your blog
- Contribute improvements to open-source projects
Conclusion
What You Accomplished
In this tutorial, you:
- ✅ Created a complete Bitbucket Pipelines configuration
- ✅ Implemented parallel test execution for faster feedback
- ✅ Configured multi-layer caching to optimize build times
- ✅ Added test coverage reporting and quality gates
- ✅ Set up branch-specific pipeline behaviors
- ✅ Configured conditional deployment with manual controls
Key Takeaways
- Parallel execution can reduce build times by 50-70%
- Caching strategies are critical for fast feedback loops
- Quality gates prevent low-quality code from reaching production
- Branch-specific configs balance speed and thoroughness
- Manual deployment triggers provide safety for production releases
Keep Learning
This is just the beginning! Revisit the Related Tutorials above to keep sharpening your CI/CD skills.
Questions or feedback? Share your pipeline setup in the comments below!
Found this helpful? Share it with your team to improve your CI/CD workflow!