author | Reinier van der Leer <pwuts@agpt.co> | 2023-12-07 14:46:08 +0100 |
---|---|---|
committer | GitHub <noreply@github.com> | 2023-12-07 14:46:08 +0100 |
commit | 1f40d720810c5eae43ec3062741c22b9a9cd821e (patch) | |
tree | 061d602e3c17da6f94b353c6e086e0c2addd75e7 /.github | |
parent | Update import path in agent_protocol.py (#6512) (diff) | |
feat(agent/workspace): Add GCS and S3 FileWorkspace providers (#6485)
* refactor: Rename FileWorkspace to LocalFileWorkspace and create FileWorkspace abstract class
- Rename `FileWorkspace` to `LocalFileWorkspace` to provide a more descriptive name for the class that represents a file workspace that works with local files.
- Create a new base class `FileWorkspace` to serve as the parent class for `LocalFileWorkspace`. This allows for easier extension and customization of file workspaces in the future.
- Update import statements and references to `FileWorkspace` throughout the codebase to use the new naming conventions.
* feat: Add S3FileWorkspace + tests + test setups for CI and Docker
- Added S3FileWorkspace class to provide an interface for interacting with a file workspace and storing files in an S3 bucket.
- Updated pyproject.toml to include dependencies for boto3 and boto3-stubs.
- Implemented unit tests for S3FileWorkspace.
- Added MinIO service to Docker CI to allow testing S3 features in CI.
- Added autogpt-test service config to docker-compose.yml for local testing with MinIO.
* ci(docker): tee test output instead of capturing
* fix: Improve error handling in S3FileWorkspace.initialize()
- Do not tolerate all `botocore.exceptions.ClientError`s
- Raise the exception anyway if the error is not "NoSuchBucket"
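The pattern described above — treat only "bucket does not exist" as recoverable and re-raise everything else — can be sketched without a live S3 connection. `ClientError` below is a minimal stand-in mirroring botocore's real `response["Error"]["Code"]` shape, and `ensure_bucket` with its callback arguments is a hypothetical helper, not the actual `S3FileWorkspace.initialize()`:

```python
class ClientError(Exception):
    """Minimal stand-in for botocore.exceptions.ClientError: botocore
    puts the service error code in `response["Error"]["Code"]`."""

    def __init__(self, error_code: str):
        super().__init__(error_code)
        self.response = {"Error": {"Code": error_code}}


def ensure_bucket(get_bucket, create_bucket, name: str):
    """Fetch the workspace bucket, creating it only if it is missing.

    Any other ClientError (e.g. AccessDenied) is re-raised instead of
    being silently treated as "bucket not found".
    """
    try:
        return get_bucket(name)
    except ClientError as e:
        if e.response["Error"]["Code"] != "NoSuchBucket":
            raise
        return create_bucket(name)
```

Swallowing every `ClientError` would mask permission or endpoint problems as a missing bucket, which is exactly the bug this fix addresses.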
* feat: Add S3 workspace backend support and S3Credentials
- Added support for an S3 workspace backend in the AutoGPT configuration
- Added a new sub-config `S3Credentials` to store S3 credentials
- Modified the `.env.template` file to include variables related to S3 credentials
- Added a new `s3_credentials` attribute on the `Config` class to store S3 credentials
- Moved the `unmasked` method from `ModelProviderCredentials` to the parent `ProviderCredentials` class to handle unmasking for S3 credentials
* fix(agent/tests): Fix S3FileWorkspace initialization in test_s3_file_workspace.py
- Update the S3FileWorkspace initialization in the test_s3_file_workspace.py file to include the required S3 Credentials.
* refactor: Remove S3Credentials and add get_workspace function
- Remove `S3Credentials` as boto3 will fetch the config from the environment by itself
- Add `get_workspace` function in `autogpt.file_workspace` module
- Update `.env.template` and tests to reflect the changes
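A factory like `get_workspace` typically maps a backend name to a workspace class. The registry approach below is one plausible shape under those assumptions — the decorator and registry are illustrative, not the actual `autogpt.file_workspace` implementation:

```python
from pathlib import Path

_BACKENDS: dict[str, type] = {}


def register_backend(name: str):
    """Class decorator that registers a workspace backend under a name."""
    def decorator(cls):
        _BACKENDS[name] = cls
        return cls
    return decorator


def get_workspace(backend: str, root_path: Path):
    """Instantiate the workspace class registered for `backend`."""
    try:
        workspace_cls = _BACKENDS[backend]
    except KeyError:
        raise ValueError(f"Unknown workspace backend {backend!r}") from None
    return workspace_cls(root_path)


@register_backend("local")
class LocalFileWorkspace:
    def __init__(self, root_path: Path):
        self.root = root_path
```

Note that with boto3 reading credentials from the environment on its own, the factory never has to touch S3 configuration — which is why `S3Credentials` could be removed.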
* feat(agent/workspace): Make agent workspace backend configurable
- Modified the `autogpt.file_workspace.get_workspace` function to take either a workspace `id` or a `root_path`.
- Modified `FileWorkspaceMixin` to use the `get_workspace` function to set up the workspace.
- Updated the type hints and imports accordingly.
* feat(agent/workspace): Add GCSFileWorkspace for Google Cloud Storage
- Added support for Google Cloud Storage as a storage backend option in the workspace.
- Created the `GCSFileWorkspace` class to interface with a file workspace stored in a Google Cloud Storage bucket.
- Implemented the `GCSFileWorkspaceConfiguration` class to handle the configuration for Google Cloud Storage workspaces.
- Updated the `get_workspace` function to include the option to use Google Cloud Storage as a workspace backend.
- Added unit tests for the new `GCSFileWorkspace` class.
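Bucket-backed workspaces like `GCSFileWorkspace` and `S3FileWorkspace` generally map workspace-relative paths to blob keys under a per-workspace prefix rather than to filesystem paths. The in-memory toy below illustrates only that key mapping — the dict stands in for a bucket, and the class and method names are hypothetical:

```python
class InMemoryFileWorkspace:
    """Toy object-store workspace: a dict stands in for a GCS/S3 bucket,
    and workspace-relative paths become blob keys under a prefix."""

    def __init__(self, prefix: str):
        self._prefix = prefix.rstrip("/")
        self._blobs: dict[str, str] = {}  # blob key -> content

    def _key(self, path: str) -> str:
        # "out/report.md" -> "workspaces/task-1/out/report.md"
        return f"{self._prefix}/{path.lstrip('/')}"

    def write_file(self, path: str, content: str) -> None:
        self._blobs[self._key(path)] = content

    def read_file(self, path: str) -> str:
        return self._blobs[self._key(path)]

    def list_files(self) -> list[str]:
        start = len(self._prefix) + 1
        return sorted(k[start:] for k in self._blobs)
```

Because directories don't exist in an object store, only the prefix convention separates one workspace's files from another's.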
* fix: Unbreak use of non-local workspaces in AgentProtocolServer
- Modify the `_get_task_agent_file_workspace` method to handle both local and non-local workspaces correctly
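One way to read "handle both local and non-local workspaces": a local backend needs an on-disk root under the agent's data directory, while a bucket backend only needs a per-task prefix. A hypothetical sketch of that branching (paths, the function name, and the `AutoGPT-<task_id>` naming are assumptions, not the actual server code):

```python
from pathlib import Path


def workspace_root_for_task(backend: str, task_id: str, app_data_dir: Path) -> Path:
    """Pick a workspace root for an agent-protocol task.

    Local workspaces need a real directory under the app's data dir;
    bucket-backed workspaces only need a per-task prefix as their "root".
    """
    if backend == "local":
        return app_data_dir / "agents" / f"AutoGPT-{task_id}" / "workspace"
    # s3 / gcs: the "root" is just a key prefix inside the bucket
    return Path("/") / task_id
```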
Diffstat (limited to '.github')
-rw-r--r-- | .github/workflows/autogpt-ci.yml | 14 |
-rw-r--r-- | .github/workflows/autogpt-docker-ci.yml | 31 |
2 files changed, 34 insertions, 11 deletions
```diff
diff --git a/.github/workflows/autogpt-ci.yml b/.github/workflows/autogpt-ci.yml
index 9005d1607..2ce756a7a 100644
--- a/.github/workflows/autogpt-ci.yml
+++ b/.github/workflows/autogpt-ci.yml
@@ -83,6 +83,15 @@ jobs:
       matrix:
         python-version: ["3.10"]
 
+    services:
+      minio:
+        image: minio/minio:edge-cicd
+        ports:
+          - 9000:9000
+        options: >
+          --health-interval=10s --health-timeout=5s --health-retries=3
+          --health-cmd="curl -f http://localhost:9000/minio/health/live"
+
     steps:
       - name: Checkout repository
         uses: actions/checkout@v3
@@ -154,8 +163,11 @@ jobs:
             tests/unit tests/integration
         env:
           CI: true
-          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
           PLAIN_OUTPUT: True
+          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
+          S3_ENDPOINT_URL: http://localhost:9000
+          AWS_ACCESS_KEY_ID: minioadmin
+          AWS_SECRET_ACCESS_KEY: minioadmin
 
       - name: Upload coverage reports to Codecov
         uses: codecov/codecov-action@v3
diff --git a/.github/workflows/autogpt-docker-ci.yml b/.github/workflows/autogpt-docker-ci.yml
index 4588f2869..dc555c381 100644
--- a/.github/workflows/autogpt-docker-ci.yml
+++ b/.github/workflows/autogpt-docker-ci.yml
@@ -89,6 +89,15 @@ jobs:
   test:
     runs-on: ubuntu-latest
     timeout-minutes: 10
+
+    services:
+      minio:
+        image: minio/minio:edge-cicd
+        options: >
+          --name=minio
+          --health-interval=10s --health-timeout=5s --health-retries=3
+          --health-cmd="curl -f http://localhost:9000/minio/health/live"
+
     steps:
       - name: Check out repository
         uses: actions/checkout@v3
@@ -124,23 +133,25 @@ jobs:
         env:
           CI: true
           PLAIN_OUTPUT: True
           OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
+          S3_ENDPOINT_URL: http://minio:9000
+          AWS_ACCESS_KEY_ID: minioadmin
+          AWS_SECRET_ACCESS_KEY: minioadmin
         run: |
           set +e
-          test_output=$(
-            docker run --env CI --env OPENAI_API_KEY \
-              --entrypoint poetry ${{ env.IMAGE_NAME }} run \
-              pytest -v --cov=autogpt --cov-branch --cov-report term-missing \
-                --numprocesses=4 --durations=10 \
-                tests/unit tests/integration 2>&1
-          )
-          test_failure=$?
+          docker run --env CI --env OPENAI_API_KEY \
+            --network container:minio \
+            --env S3_ENDPOINT_URL --env AWS_ACCESS_KEY_ID --env AWS_SECRET_ACCESS_KEY \
+            --entrypoint poetry ${{ env.IMAGE_NAME }} run \
+            pytest -v --cov=autogpt --cov-branch --cov-report term-missing \
+              --numprocesses=4 --durations=10 \
+              tests/unit tests/integration 2>&1 | tee test_output.txt
 
-          echo "$test_output"
+          test_failure=${PIPESTATUS[0]}
 
           cat << $EOF >> $GITHUB_STEP_SUMMARY
           # Tests $([ $test_failure = 0 ] && echo '✅' || echo '❌')
           \`\`\`
-          $test_output
+          $(cat test_output.txt)
           \`\`\`
           $EOF
```