# Contributing to the CosmoTech Acceleration Library

Thank you for your interest in contributing to the CosmoTech Acceleration Library! This document provides guidelines and instructions for contributing to this project.
## Table of Contents

- Code of Conduct
- Getting Started
- Development Workflow
- Testing Requirements
- Documentation Guidelines
- Pull Request Process
- Style Guide
## Code of Conduct

We are committed to providing a friendly, safe, and welcoming environment for all contributors. We expect everyone to be respectful and considerate of others.
## Getting Started

- Fork the repository on GitHub
- Clone your fork locally
- Set up the development environment:

  ```shell
  pip install -e ".[dev]"
  pre-commit install
  ```
## Development Workflow

- Create a new branch for your feature or bugfix
- Make your changes
- Run tests and ensure code coverage is maintained or improved
- Update documentation as needed
- Submit a pull request
## Testing Requirements

All contributions to the `cosmotech.coal` module must include appropriate test coverage. This is a strict requirement to maintain code quality and reliability.
- Write unit tests for all new functionality
- Ensure existing tests pass with your changes
- Maintain or improve the current code coverage percentage
- Use mocking for external services to ensure tests are reliable and fast
```shell
# Run tests with coverage reporting
pytest tests/unit/coal/ --cov=cosmotech.coal --cov-report=term-missing --cov-report=html
```

- Place tests in the appropriate subdirectory under `tests/unit/coal/`
- Follow the naming convention `test_module_file.py` to ensure unique test file names
- Use fixtures from `conftest.py` where appropriate
- Mock external dependencies to ensure tests are isolated
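As a sketch of these conventions, a unit test might look like the following. The function `fetch_item` and the module paths in the comments are hypothetical, used only to illustrate the layout and the mocking style:

```python
from unittest.mock import MagicMock

# Hypothetical function under test. In a real contribution it would live in
# cosmotech/coal/<module>/<file>.py, with this test saved as
# tests/unit/coal/<module>/test_<module>_<file>.py to keep file names unique.
def fetch_item(client, key):
    """Return the record stored under ``key`` using the given client."""
    return client.get(key)

def test_fetch_item_delegates_to_client():
    # The external client is mocked so the test is fast and isolated
    mock_client = MagicMock()
    mock_client.get.return_value = {"id": 1}

    assert fetch_item(mock_client, "item-1") == {"id": 1}
    mock_client.get.assert_called_once_with("item-1")

test_fetch_item_delegates_to_client()
```

Shared mocks and sample data used by several such tests belong in the nearest `conftest.py` as fixtures.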
To help maintain test coverage, we provide tools to identify untested functions and generate test files:
```shell
# Find functions without tests
python find_untested_functions.py

# Generate test files for a specific module
python generate_test_files.py --module cosmotech/coal/module/file.py

# Generate test files for all untested functions
python generate_test_files.py --all
```

These tools help ensure that every function has at least one test. When using the generated test files:

- Verify that the functions actually exist in the module (the generator tries to check this, but may miss some cases)
- Implement the test logic by replacing the `pass` statements with actual test code
- Use mocking for external dependencies to ensure tests are isolated
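For instance, a generated stub can be filled in as below. The `load_config` helper is hypothetical and defined inline so the sketch is self-contained:

```python
from unittest.mock import mock_open, patch

# Hypothetical function under test, defined inline for the example
def load_config(path):
    with open(path) as f:
        return f.read()

# As generated, the test body would just be `pass`; the filled-in version
# mocks the filesystem so the test stays isolated and repeatable:
def test_load_config_reads_file():
    with patch("builtins.open", mock_open(read_data="key: value")) as m:
        assert load_config("settings.yaml") == "key: value"
        m.assert_called_once_with("settings.yaml")

test_load_config_reads_file()
```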
- New code should aim for at least 80% coverage
- Critical components should have close to 100% coverage
- Use `# pragma: no cover` sparingly and only for code that genuinely cannot be tested
- Every function must have at least one test; this is a strict requirement to ensure basic functionality is tested
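As an illustration of a legitimate `# pragma: no cover` use, consider a branch that only executes on a platform the test suite never runs on (the function below is hypothetical):

```python
import sys

def platform_tag():
    """Return a short tag identifying the current platform family."""
    if sys.platform.startswith("win"):  # pragma: no cover
        # Only reachable on Windows, which non-Windows CI can never
        # exercise, so this line is excluded from coverage there.
        return "windows"
    return "posix"
```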
## Documentation Guidelines

All new features must be documented. This includes:

- Docstrings: All public functions, classes, and methods must have clear docstrings following the existing format
- Examples: Include usage examples where appropriate
- Tutorials: For significant features, consider adding a tutorial in the `tutorial/` directory
- API Documentation: Update API documentation if your changes affect the public API
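A docstring in that spirit might look as follows. The Google-style `Args`/`Returns` layout here is an assumption, so match whatever format the surrounding module already uses:

```python
def scale_values(values, factor=1.0):
    """Scale every value in ``values`` by ``factor``.

    Args:
        values: Iterable of numbers to scale.
        factor: Multiplier applied to each value. Defaults to 1.0.

    Returns:
        A list containing each input value multiplied by ``factor``.
    """
    return [v * factor for v in values]
```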
## Pull Request Process

- Ensure all tests pass and coverage requirements are met
- Update documentation as needed
- Write a clear and descriptive pull request description that:
  - Explains the purpose of the changes
  - Describes how the changes address the issue
  - Lists any dependencies that were added or modified
  - Mentions any breaking changes
- Reference any related issues using the GitHub issue reference syntax (for example, `Fixes #123`)
- Wait for code review and address any feedback
## Style Guide

- Follow the existing code style (we use Black for formatting)
- Run pre-commit hooks before committing to ensure style consistency
- Use meaningful variable and function names
- Keep functions focused on a single responsibility
- Write clear comments for complex logic
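As a small illustration of meaningful names and single responsibility, parsing and aggregation can be kept as separate, clearly named helpers (both functions are hypothetical):

```python
def parse_amounts(lines):
    """Convert raw text lines into floats, skipping blank lines."""
    return [float(line) for line in lines if line.strip()]

def total_above(amounts, threshold):
    """Sum only the amounts strictly greater than ``threshold``."""
    return sum(a for a in amounts if a > threshold)

# Each helper does one job, so each can be tested (and reused) on its own.
```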
### Commit Messages

Write clear, concise commit messages that explain the "why" behind changes. Follow this format:

```
[Component] Short summary of changes (50 chars or less)

More detailed explanation if needed. Wrap lines at 72 characters.
Explain the problem that this commit is solving and why you're solving
it this way.

Fixes #123
```
## Mocking External Services

When writing tests for code that interacts with external services (AWS, Azure, CosmoTech API, etc.), always use mocks to ensure tests are:
- Fast: Tests should run quickly without waiting for external services
- Reliable: Tests should not fail due to network issues or service unavailability
- Isolated: Tests should not depend on external state or configuration
- Repeatable: Tests should produce the same results every time they run
Example of mocking an external service:

```python
from unittest.mock import MagicMock, patch

@patch('boto3.client')
def test_s3_upload(mock_client):
    # Set up the mock
    mock_s3 = MagicMock()
    mock_client.return_value = mock_s3

    # Test the function
    result = upload_to_s3('file.txt', 'bucket-name')

    # Verify the mock was called correctly
    mock_s3.upload_file.assert_called_once_with('file.txt', 'bucket-name', 'file.txt')
```

Thank you for contributing to the CosmoTech Acceleration Library!