From f7aba96802a629d2829fc09606c67a07364c3016 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Jean=20Carlos=20Alarc=C3=B3n?= <56373098+jcalarcon98@users.noreply.github.com> Date: Wed, 4 Aug 2021 11:53:29 -0500 Subject: [PATCH 1/4] fix: TT-302 Fix URLLIB3 dependencies vulnerabilities (#313) --- requirements/azure_cosmos.txt | 2 +- requirements/commons.txt | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/requirements/azure_cosmos.txt b/requirements/azure_cosmos.txt index f4d95df0..62ae1c17 100644 --- a/requirements/azure_cosmos.txt +++ b/requirements/azure_cosmos.txt @@ -9,7 +9,7 @@ certifi==2019.11.28 chardet==3.0.4 idna==2.8 six==1.13.0 -urllib3==1.25.8 +urllib3==1.26.5 virtualenv==16.7.9 virtualenv-clone==0.5.3 diff --git a/requirements/commons.txt b/requirements/commons.txt index 9b5d811c..aef1f707 100644 --- a/requirements/commons.txt +++ b/requirements/commons.txt @@ -3,7 +3,7 @@ # For Common dependencies # Handling requests -requests==2.23.0 +requests==2.25.1 # To create sample content in tests and API documentation Faker==4.0.2 From 51d9adf834d782439016eb68ef9ead0f11d57e6c Mon Sep 17 00:00:00 2001 From: jcalarcon98 Date: Fri, 30 Jul 2021 11:32:41 -0500 Subject: [PATCH 2/4] docs: TT-301 Update readme documentation and add Time Tracker CLI docs --- README.md | 353 +++++++----------- cosmosdb_emulator/README.md | 88 +++++ requirements/time_tracker_api/dev.txt | 3 - .../data_access_layer/cosmos_db_test.py | 70 ++-- time-tracker.sh | 11 + 5 files changed, 266 insertions(+), 259 deletions(-) create mode 100644 cosmosdb_emulator/README.md create mode 100644 time-tracker.sh diff --git a/README.md b/README.md index 63029102..d52e23b6 100644 --- a/README.md +++ b/README.md @@ -10,135 +10,183 @@ Follow the following instructions to get the project ready to use ASAP. 
### Requirements -Be sure you have installed in your system +Be sure you have installed in your system: - [Python version 3](https://www.python.org/download/releases/3.0/) (recommended 3.8 or less) in your path. It will install automatically [pip](https://pip.pypa.io/en/stable/) as well. -- A virtual environment, namely [venv](https://docs.python.org/3/library/venv.html). +- A virtual environment, namely [.venv](https://docs.python.org/3/library/venv.html). - Optionally for running Azure functions locally: [Azure functions core tool](https://docs.microsoft.com/en-us/azure/azure-functions/functions-run-local?tabs=macos%2Ccsharp%2Cbash). +- Docker + + You can follow the instructions below to install on each of the following operating systems: + - [**Mac**](https://docs.docker.com/docker-for-mac/install/) + - [**Linux**](https://docs.docker.com/engine/install/) + - [**Windows**](https://docs.docker.com/docker-for-windows/install/) + +- Docker Compose + + To install Docker Compose, please choose the operating system you use and follow the steps [here](https://docs.docker.com/compose/install/). + ### Setup -- Create and activate the environment, +Once installed Docker and Docker Compose we must create a `.env` file in the root of our project where we will put the following environment variables. 
- In Windows: +```shell +export MS_AUTHORITY=XXXX +export MS_CLIENT_ID=XXXX +export MS_SCOPE=XXXX +export MS_SECRET=yFo=XXXX +export MS_ENDPOINT=XXXX +export DATABASE_ACCOUNT_URI=XXXX +export DATABASE_MASTER_KEY=XXXX +export DATABASE_NAME=XXXX +export FLASK_APP=XXXX +export AZURE_APP_CONFIGURATION_CONNECTION_STRING=XXXX +export FLASK_DEBUG=XXXX +export REQUESTS_CA_BUNDLE=XXXX +``` +> **Please, contact the project development team for the content of the variables mentioned above.** - ``` - #Create virtual enviroment - python -m venv .venv +### Run containers - #Execute virtual enviroment - .venv\Scripts\activate.bat - ``` +Once all the project configuration is done, we are going to execute the following command in the terminal, taking into account that we are inside the root folder of the project: - In Unix based operative systems: +```shell +docker-compose up --build +``` - ``` - #Create virtual enviroment - virtualenv .venv +This command will build all images with the necessary configurations for each one, aslo +raises the cosmos emulator in combination with the backend, now you can open in the browser: - #Execute virtual enviroment - source .venv/bin/activate - ``` +- `http://127.0.0.1:5000/` open backend API. +- `https://127.0.0.1:8081/_explorer/index.html` to open Cosmos DB emulator. -**Note:** If you're a linux user you will need to install an additional dependency to have it working. 
+> If you have already executed the command (`docker-compose up --build`) previously in this project, +> it is not necessary to execute it again, instead it should be executed like this: +> `docker-compose up` -Type in the terminal the following command to install the required dependency to have pyodbc working locally: +> It is also important to clarify that if packages or any extra configuration is added to the images construction, +> you need to run again `docker-compose up --build`, you can see more information about this flag [here](https://docs.docker.com/compose/reference/up/) -```sh -sudo apt-get install unixodbc-dev -``` +## Development + +### Generate Fake Data + +In order to generate fake data to test functionalities or correct errors, +we have built a CLI, called 'Time Tracker CLI', which is in charge of generating +the fake information inside the Cosmos emulator. -- Install the requirements: +To learn how this CLI works, you can see the instructions [here](https://github.com/ioet/time-tracker-backend/tree/master/cosmosdb_emulator) - ``` - python3 -m pip install -r requirements//.txt - ``` +### Git hooks - If you use Windows, you will use this comand: +We use [pre-commit](https://github.com/pre-commit/pre-commit) library to manage local git hooks, +as developers we just need to run in our virtual environment. - ``` - python -m pip install -r requirements//.txt - ``` +This library allows you to execute code right before the commit, for example: +- Check if the commit contains the correct formatting. +- Format modified files based on a Style Guide such as PEP 8, etc. - Where `` is one of the executable app namespace, e.g. `time_tracker_api` or `time_tracker_events` (**Note:** Currently, only `time_tracker_api` is used.). The `stage` can be +To install and use `pre-commit` we have to perform the following steps: - - `dev`: Used for working locally - - `prod`: For anything deployed +**Create the environment** -Remember to do it with Python 3. 
+Execute the next command at the root of the project: -Bear in mind that the requirements for `time_tracker_events`, must be located on its local requirements.txt, by -[convention](https://docs.microsoft.com/en-us/azure/azure-functions/functions-reference-python#folder-structure). +```shell +python -m venv .venv +``` -- Run `pre-commit install -t pre-commit -t commit-msg`. For more details, see section Development > Git hooks. +> **Note:** We can replace python for python3 or python3.8 according to the version you have installed, +> but do not forget the initial requirements. -### Set environment variables +**Activate the environment** -Set environment variables with the content pinned in our slack channel #time-tracker-developer: +Windows: +```shell +.venv\Scripts\activate.bat +``` -When you use Bash or GitBash you should use: +In Unix based operative systems: +```shell +source .venv/bin/activate ``` -export MS_AUTHORITY=XXX -export MS_CLIENT_ID=XXX -export MS_SCOPE=XXX -export MS_SECRET=XXX -export MS_ENDPOINT=XXX -export DATABASE_ACCOUNT_URI=XXX -export DATABASE_MASTER_KEY=XXX -export DATABASE_NAME=XXX -export FLASK_APP=XXX -export AZURE_APP_CONFIGURATION_CONNECTION_STRING=XXX -export FLASK_DEBUG=True + +Once the environment has been created and activated we have to run: +```shell +python3 -m pip install pre-commit ``` -If you use PowerShell, you should use: +Once `pre-commit` library is installed, we are going to execute the following command: + +```shell +pre-commit install -t pre-commit -t commit-msg +``` +For more details, see section Development > Git hooks. + +With this command the library will take configuration from `.pre-commit-config.yaml` and will set up the hooks by us. + +### Commit message style +Use the following commit message style. 
e.g: + +```shell +'feat: TT-123 Applying some changes' +'fix: TT-321 Fixing something broken' +'feat(config): TT-00 Fix something in config files' ``` -$env:MS_AUTHORITY="XXX" -$env:MS_CLIENT_ID="XXX" -$env:MS_SCOPE="XXX" -$env:MS_SECRET="XXX" -$env:MS_ENDPOINT="XXX" -$env:DATABASE_ACCOUNT_URI="XXX" -$env:DATABASE_MASTER_KEY="XXX" -$env:DATABASE_NAME="XXX" -$env:FLASK_APP="XXX" -$env:AZURE_APP_CONFIGURATION_CONNECTION_STRING="XXX" -$env:FLASK_DEBUG="True" + +The value `TT-###` refers to the Jira issue that is being solved. Use TT-00 if the commit does not refer to any issue. + +### Branch names format + +For example if your task in Jira is **TT-48 implement semantic versioning** your branch name is: + +```shell +TT-48-implement-semantic-versioning ``` -If you use Command Prompt, you should use: +### Test + +We are using [Pytest](https://docs.pytest.org/en/latest/index.html) for tests. The tests are located in the package +`tests` and use the [conventions for python test discovery](https://docs.pytest.org/en/latest/goodpractices.html#test-discovery). + +> Remember To run any available test command we have to have the containers up (`docker-compose up`). + +This command run all tests: +```shell +./time-tracker.sh pytest -v ``` -set "MS_AUTHORITY=XXX" -set "MS_CLIENT_ID=XXX" -set "MS_SCOPE=XXX" -set "MS_SECRET=XXX" -set "MS_ENDPOINT=XXX" -set "DATABASE_ACCOUNT_URI=XXX" -set "DATABASE_MASTER_KEY=XXX" -set "DATABASE_NAME=XXX" -set "FLASK_APP=XXX" -set "AZURE_APP_CONFIGURATION_CONNECTION_STRING=XXX" -set "FLASK_DEBUG=True" + +#### Coverage + +To check the coverage of the tests execute: + +```shell +./time-tracker.sh coverage run -m pytest -v ``` -**Note:** You can create .env (Bash, GitBash), .env.bat (Command Prompt), .env.ps1 (PowerShell) files with environment variables and run them in the corresponding console. +To get a report table: -Important: You should set the environment variables each time the application is run. 
+```shell +./time-tracker.sh coverage report +``` -### How to use it +To get a full report in html: -- Start the app: +```shell +./time-tracker.sh coverage html +``` +Then check in the [htmlcov/index.html](./htmlcov/index.html) to see it. - ``` - flask run - ``` +If you want that previously collected coverage data is erased, you can execute: -- Open `http://127.0.0.1:5000/` in a browser. You will find in the presented UI - a link to the swagger.json with the definition of the api. +```shell +./time-tracker.sh coverage erase +``` ### Handling Cosmos DB triggers for creating events with time_tracker_events @@ -227,120 +275,6 @@ If you require to deploy `time_tracker_events` from your local machine to Azure func azure functionapp publish time-tracker-events --build local ``` -## Development - -### Git hooks - -We use [pre-commit](https://github.com/pre-commit/pre-commit) library to manage local git hooks, as developers we just need to run in our virtual environment: - -``` -pre-commit install -t pre-commit -t commit-msg -``` - -With this command the library will take configuration from `.pre-commit-config.yaml` and will set up the hooks by us. - -### Commit message style - -Use the following commit message style. e.g: - -``` -'feat: TT-123 Applying some changes' -'fix: TT-321 Fixing something broken' -'feat(config): TT-00 Fix something in config files' -``` - -The value `TT-###` refers to the Jira issue that is being solved. Use TT-00 if the commit does not refer to any issue. - -### Branch names format - -For example if your task in Jira is **TT-48 implement semantic versioning** your branch name is: - -``` - TT-48-implement-semantic-versioning -``` - -### Test - -We are using [Pytest](https://docs.pytest.org/en/latest/index.html) for tests. The tests are located in the package -`tests` and use the [conventions for python test discovery](https://docs.pytest.org/en/latest/goodpractices.html#test-discovery). 
- -#### Integration tests - -The [integrations tests](https://en.wikipedia.org/wiki/Integration_testing) verifies that all the components of the app -are working well together. These are the default tests we should run: - -This command run all tests: - -```dotenv -python3 -m pytest -v --ignore=tests/commons/data_access_layer/azure/sql_repository_test.py -``` - -In windows - -``` -python -m pytest -v --ignore=tests/commons/data_access_layer/azure/sql_repository_test.py -``` - -**Note:** If you get the error "No module named azure.functions", execute the command: - -``` -pip install azure-functions -``` - -To run a sigle test: - -``` -pytest -v -k name-test -``` - -As you may have noticed we are ignoring the tests related with the repository. - -#### System tests - -In addition to the integration testing we might include tests to the data access layer in order to verify that the -persisted data is being managed the right way, i.e. it actually works. We may classify the execution of all the existing -tests as [system testing](https://en.wikipedia.org/wiki/System_testing): - -```dotenv -python3 -m pytest -v -``` - -The database tests will be done in the table `tests` of the database specified by the variable `SQL_DATABASE_URI`. If this -variable is not specified it will automatically connect to SQLite database in-memory. This will do, because we are using -[SQL Alchemy](https://www.sqlalchemy.org/features.html) to be able connect to any SQL database maintaining the same -codebase. - -The option `-v` shows which tests failed or succeeded. Have into account that you can also debug each test -(test\_\* files) with the help of an IDE like PyCharm. - -#### Coverage - -To check the coverage of the tests execute - -```bash - coverage run -m pytest -v -``` - -To get a report table - -```bash - coverage report -``` - -To get a full report in html - -```bash - coverage html -``` - -Then check in the [htmlcov/index.html](./htmlcov/index.html) to see it. 
- -If you want that previously collected coverage data is erased, you can execute: - -``` -coverage erase -``` - ### CLI There are available commands, aware of the API, that can be very helpful to you. You @@ -374,22 +308,6 @@ standard commit message style. [python-semantic-release](https://python-semantic-release.readthedocs.io/en/latest/commands.html#publish) for details of underlying operations. -## Run as docker container - -1. Build image - -```bash -docker build -t time_tracker_api:local . -``` - -2. Run app - -```bash -docker run -p 5000:5000 time_tracker_api:local -``` - -3. Visit `127.0.0.1:5000` - ## Migrations Looking for a DB-agnostic migration tool, the only choice I found was [migrate-anything](https://pypi.org/project/migrate-anything/). @@ -438,13 +356,6 @@ They will be automatically run during the Continuous Deployment process. Shared file with all the Feature Toggles we create, so we can have a history of them [Feature Toggles dictionary](https://github.com/ioet/time-tracker-ui/wiki/Feature-Toggles-dictionary) -## Support for docker-compose and cosmosdb emulator - -To run the dev enviroment in docker-compose: -```bash -docker-compose up -``` - ## More information about the project [Starting in Time Tracker](https://github.com/ioet/time-tracker-ui/wiki/Time-tracker) diff --git a/cosmosdb_emulator/README.md b/cosmosdb_emulator/README.md new file mode 100644 index 00000000..840247ee --- /dev/null +++ b/cosmosdb_emulator/README.md @@ -0,0 +1,88 @@ +# Time Tracker CLI + +Here you can find all the source code of the Time Tracker CLI. +This is responsible for automatically generating fake data for the Cosmos emulator, +in order to have information when testing new features or correcting bugs. + +## Prerequisites + +- Backend and cosmos emulator containers up. +- Environment variables correctly configured + +### Environment Variables. 
+
+The main environment variables that you need to take into account are the following:
+
+```shell
+export DATABASE_ACCOUNT_URI=https://azurecosmosemulator:8081
+export DATABASE_MASTER_KEY=C2y6yDjf5/R+ob0N8A7Cgv30VRDJIWEHLM+4QDU5DE2nQ9nDuVTqobD4b8mGGyPMbIZnqyMsEcaGQy67XIw/Jw==
+export DATABASE_NAME=time_tracker_testing_database
+```
+Verify that the variables are the same as those shown above.
+
+## How to use Time Tracker CLI?
+
+From the project's root folder, go to the `cosmosdb_emulator` folder and open a terminal.
+
+We have two main alternatives for running the CLI:
+
+### Execute CLI with flags.
+
+In order to see all the available flags for the CLI we are going to execute the following command:
+
+```shell
+./cli.sh main.py --help
+```
+
+When executing the above command, the following information will be displayed:
+
+![image](https://user-images.githubusercontent.com/56373098/127604274-041c2af7-d7a8-4b8d-b784-8280773b68c8.png)
+
+There you can see the actions we can perform on a given entity.
+
+Currently, the CLI only allows the creation of Time-entries and allows the deletion of any entity.
+
+Available Actions:
+
+- Create: Allows creating new fake data about a certain entity.
+- Delete: Allows deleting existing data about a certain entity.
+
+> To delete information about a certain entity, take into account the relationships
+that this entity has with other entities, since the related information will also be deleted.
+The following diagram can be used as a reference:
+![image](https://user-images.githubusercontent.com/56373098/127604828-77cc1f90-21d4-4c63-9881-9d6546d84445.png)
+
+Available Entities:
+
+- Customers
+- Projects
+- Project-Types
+- Activities
+- Time-entries
+
+Considering the actions that we can execute on the entities, we can run the following command
+to generate entries:
+```shell
+./cli.sh main.py -a Create -e Time-entries
+```
+
+The result of this command will be as follows:
+
+![image](https://user-images.githubusercontent.com/56373098/127606245-6cb5a0d1-ada6-4194-bbeb-6bd9679b676b.png)
+
+In this way we can continue with the generation of entities.
+
+### Execute CLI in an interactive way
+
+To run the CLI interactively, we need to execute the following command:
+
+```shell
+./cli.sh main.py
+```
+After executing the above command, the following will be displayed:
+
+![image](https://user-images.githubusercontent.com/56373098/127606606-422c6841-bd40-4f36-be2e-e765d333beed.png)
+
+This way we can interact dynamically with the CLI for the generation/deletion of entities.
+
+> Currently, for the generation of personal entries it is necessary to know the identifier of our user within Time Tracker.
\ No newline at end of file diff --git a/requirements/time_tracker_api/dev.txt b/requirements/time_tracker_api/dev.txt index 6d8a1599..302acb78 100644 --- a/requirements/time_tracker_api/dev.txt +++ b/requirements/time_tracker_api/dev.txt @@ -14,9 +14,6 @@ pytest-mock==2.0.0 # Coverage coverage==4.5.1 -# Git hooks -pre-commit==2.2.0 - # CLI tools PyInquirer==1.0.3 pyfiglet==0.7 diff --git a/tests/commons/data_access_layer/cosmos_db_test.py b/tests/commons/data_access_layer/cosmos_db_test.py index c7a04eaf..6a84afeb 100644 --- a/tests/commons/data_access_layer/cosmos_db_test.py +++ b/tests/commons/data_access_layer/cosmos_db_test.py @@ -148,41 +148,41 @@ def test_create_with_same_id_but_diff_partition_key_attrib_should_succeed( assert result["id"] == sample_item["id"], "Should have allowed same id" -def test_create_with_mapper_should_provide_calculated_fields( - cosmos_db_repository: CosmosDBRepository, - event_context: EventContext, - tenant_id: str, -): - new_item = dict( - id=fake.uuid4(), - name=fake.name(), - email=fake.safe_email(), - age=fake.pyint(min_value=10, max_value=80), - tenant_id=tenant_id, - ) - - created_item: Person = cosmos_db_repository.create( - new_item, event_context, mapper=Person - ) - - assert created_item is not None - assert all( - item in created_item.__dict__.items() for item in new_item.items() - ) - assert ( - type(created_item) is Person - ), "The result should be wrapped with a class" - assert created_item.is_adult() is (new_item["age"] >= 18) - - -def test_find_by_valid_id_should_succeed( - cosmos_db_repository: CosmosDBRepository, - sample_item: dict, - event_context: EventContext, -): - found_item = cosmos_db_repository.find(sample_item["id"], event_context) - - assert all(item in found_item.items() for item in sample_item.items()) +# def test_create_with_mapper_should_provide_calculated_fields( +# cosmos_db_repository: CosmosDBRepository, +# event_context: EventContext, +# tenant_id: str, +# ): +# new_item = dict( +# 
id=fake.uuid4(),
+#         name=fake.name(),
+#         email=fake.safe_email(),
+#         age=fake.pyint(min_value=10, max_value=80),
+#         tenant_id=tenant_id,
+#     )
+#
+#     created_item: Person = cosmos_db_repository.create(
+#         new_item, event_context, mapper=Person
+#     )
+#
+#     assert created_item is not None
+#     assert all(
+#         item in created_item.__dict__.items() for item in new_item.items()
+#     )
+#     assert (
+#         type(created_item) is Person
+#     ), "The result should be wrapped with a class"
+#     assert created_item.is_adult() is (new_item["age"] >= 18)
+
+
+# def test_find_by_valid_id_should_succeed(
+#     cosmos_db_repository: CosmosDBRepository,
+#     sample_item: dict,
+#     event_context: EventContext,
+# ):
+#     found_item = cosmos_db_repository.find(sample_item["id"], event_context)
+#
+#     assert all(item in found_item.items() for item in sample_item.items())
 
 
 def test_find_by_invalid_id_should_fail(
diff --git a/time-tracker.sh b/time-tracker.sh
new file mode 100644
index 00000000..fe6b0068
--- /dev/null
+++ b/time-tracker.sh
@@ -0,0 +1,11 @@
+#!/bin/sh
+COMMAND="$*"
+PYTHON_COMMAND="pip install azure-functions"
+API_CONTAINER_NAME="time-tracker-backend_api"
+
+execute(){
+  docker exec -ti $API_CONTAINER_NAME sh -c "$PYTHON_COMMAND"
+  docker exec -ti $API_CONTAINER_NAME sh -c "$COMMAND"
+}
+
+execute
\ No newline at end of file

From f75ba5d4397710b7a82f7547db1bbf332f61eefb Mon Sep 17 00:00:00 2001
From: jcalarcon98
Date: Mon, 2 Aug 2021 18:25:52 -0500
Subject: [PATCH 3/4] fix: TT-301 Remove unnecessary comments inside cosmosdb tests

---
 .../data_access_layer/cosmos_db_test.py       | 70 +++++++++----------
 1 file changed, 35 insertions(+), 35 deletions(-)

diff --git a/tests/commons/data_access_layer/cosmos_db_test.py b/tests/commons/data_access_layer/cosmos_db_test.py
index 6a84afeb..c7a04eaf 100644
--- a/tests/commons/data_access_layer/cosmos_db_test.py
+++ b/tests/commons/data_access_layer/cosmos_db_test.py
@@ -148,41 +148,41 @@ def test_create_with_same_id_but_diff_partition_key_attrib_should_succeed(
assert result["id"] == sample_item["id"], "Should have allowed same id" -# def test_create_with_mapper_should_provide_calculated_fields( -# cosmos_db_repository: CosmosDBRepository, -# event_context: EventContext, -# tenant_id: str, -# ): -# new_item = dict( -# id=fake.uuid4(), -# name=fake.name(), -# email=fake.safe_email(), -# age=fake.pyint(min_value=10, max_value=80), -# tenant_id=tenant_id, -# ) -# -# created_item: Person = cosmos_db_repository.create( -# new_item, event_context, mapper=Person -# ) -# -# assert created_item is not None -# assert all( -# item in created_item.__dict__.items() for item in new_item.items() -# ) -# assert ( -# type(created_item) is Person -# ), "The result should be wrapped with a class" -# assert created_item.is_adult() is (new_item["age"] >= 18) - - -# def test_find_by_valid_id_should_succeed( -# cosmos_db_repository: CosmosDBRepository, -# sample_item: dict, -# event_context: EventContext, -# ): -# found_item = cosmos_db_repository.find(sample_item["id"], event_context) -# -# assert all(item in found_item.items() for item in sample_item.items()) +def test_create_with_mapper_should_provide_calculated_fields( + cosmos_db_repository: CosmosDBRepository, + event_context: EventContext, + tenant_id: str, +): + new_item = dict( + id=fake.uuid4(), + name=fake.name(), + email=fake.safe_email(), + age=fake.pyint(min_value=10, max_value=80), + tenant_id=tenant_id, + ) + + created_item: Person = cosmos_db_repository.create( + new_item, event_context, mapper=Person + ) + + assert created_item is not None + assert all( + item in created_item.__dict__.items() for item in new_item.items() + ) + assert ( + type(created_item) is Person + ), "The result should be wrapped with a class" + assert created_item.is_adult() is (new_item["age"] >= 18) + + +def test_find_by_valid_id_should_succeed( + cosmos_db_repository: CosmosDBRepository, + sample_item: dict, + event_context: EventContext, +): + found_item = cosmos_db_repository.find(sample_item["id"], 
event_context)
+
+    assert all(item in found_item.items() for item in sample_item.items())
 
 
 def test_find_by_invalid_id_should_fail(

From 8090d33552aa613d34f452b7ef69b7b1c5efae57 Mon Sep 17 00:00:00 2001
From: jcalarcon98
Date: Wed, 4 Aug 2021 13:33:57 -0500
Subject: [PATCH 4/4] docs: TT-301 Improve README documentation

---
 README.md                   | 272 +++++++++++++++++++++++++++++-------
 cosmosdb_emulator/README.md |   2 +
 2 files changed, 226 insertions(+), 48 deletions(-)

diff --git a/README.md b/README.md
index d52e23b6..49dec648 100644
--- a/README.md
+++ b/README.md
@@ -6,17 +6,60 @@ This is the mono-repository for the backend services and their common codebase
 
 ## Getting started
 
-Follow the following instructions to get the project ready to use ASAP.
+Follow these instructions to get the project ready to use ASAP.
 
-### Requirements
+Currently, there are two ways to run the project: production mode, which uses a virtual
+environment with all the necessary libraries installed in it, and development mode, which uses
+Docker and docker-compose. Development mode is recommended; use production mode only in special cases.
 
-Be sure you have installed in your system:
+## Requirements:
+
+For both modes it is necessary to have the following requirements installed:
 
 - [Python version 3](https://www.python.org/download/releases/3.0/) (recommended 3.8 or less) in your path. It will install automatically [pip](https://pip.pypa.io/en/stable/) as well.
 - A virtual environment, namely [.venv](https://docs.python.org/3/library/venv.html).
 - Optionally for running Azure functions locally: [Azure functions core tool](https://docs.microsoft.com/en-us/azure/azure-functions/functions-run-local?tabs=macos%2Ccsharp%2Cbash).
 
+## Settings for each mode
+
+Before proceeding to the configuration for each of the modes,
+it is important to perform the following step regardless of the mode to be used.
+
+### Create a virtual environment
+
+Execute the following command at the root of the project:
+
+```shell
+python -m venv .venv
+```
+
+> **Note:** We can replace python with python3 or python3.8 according to the version you have installed,
+> but do not forget the initial requirements.
+
+**Activate the environment**
+
+Windows:
+```shell
+.venv\Scripts\activate.bat
+```
+
+In Unix-based operating systems:
+
+```shell
+source .venv/bin/activate
+```
+
+### Setup for each mode
+
+The configuration required for each of the modes is as follows:
+
+**Development Mode**
+
+### Requirements:
+
+In addition to the initial requirements, it is necessary to have the following requirements installed:
+
 - Docker
 
   You can follow the instructions below to install on each of the following operating systems:
@@ -28,7 +71,7 @@
 
   To install Docker Compose, please choose the operating system you use and follow the steps [here](https://docs.docker.com/compose/install/).
 
-### Setup
+### Setup
 
 Once installed Docker and Docker Compose we must create a `.env` file in the root of our project where we will put the following environment variables.
@@ -56,7 +99,7 @@
 docker-compose up --build
 ```
 
-This command will build all images with the necessary configurations for each one, aslo
+This command will build all images with the necessary configurations for each one, also
 raises the cosmos emulator in combination with the backend, now you can open in the browser:
 
 - `http://127.0.0.1:5000/` open backend API.
@@ -66,12 +109,12 @@ raises the cosmos emulator in combination with the backend, now you can open in
 > it is not necessary to execute it again, instead it should be executed like this:
 > `docker-compose up`
 
-> It is also important to clarify that if packages or any extra configuration is added to the images construction,
+> It is also important to clarify that if packages or any extra configuration is added to the image's construction,
 > you need to run again `docker-compose up --build`, you can see more information about this flag [here](https://docs.docker.com/compose/reference/up/)
 
-## Development
+### Development
 
-### Generate Fake Data
+#### Generate Fake Data
 
 In order to generate fake data to test functionalities or correct errors,
 we have built a CLI, called 'Time Tracker CLI', which is in charge of generating
 the fake information inside the Cosmos emulator.
To learn how this CLI works, you can see the instructions [here](https://github.com/ioet/time-tracker-backend/tree/master/cosmosdb_emulator) -### Git hooks +> It is important to clarify that Time Tracker CLI only works in development mode. -We use [pre-commit](https://github.com/pre-commit/pre-commit) library to manage local git hooks, -as developers we just need to run in our virtual environment. +### Test -This library allows you to execute code right before the commit, for example: -- Check if the commit contains the correct formatting. -- Format modified files based on a Style Guide such as PEP 8, etc. +We are using [Pytest](https://docs.pytest.org/en/latest/index.html) for tests. The tests are located in the package +`tests` and use the [conventions for python test discovery](https://docs.pytest.org/en/latest/goodpractices.html#test-discovery). -To install and use `pre-commit` we have to perform the following steps: +> Remember to run any available test command we have to have the containers up (`docker-compose up`). -**Create the environment** +This command run all tests: -Execute the next command at the root of the project: +```shell +./time-tracker.sh pytest -v +``` + +Run a single test: ```shell -python -m venv .venv +./time-tracker.sh pytest -v -k name-test ``` -> **Note:** We can replace python for python3 or python3.8 according to the version you have installed, -> but do not forget the initial requirements. 
+#### Coverage -**Activate the environment** +To check the coverage of the tests execute: -Windows: ```shell -.venv\Scripts\activate.bat +./time-tracker.sh coverage run -m pytest -v ``` -In Unix based operative systems: +To get a report table: ```shell -source .venv/bin/activate +./time-tracker.sh coverage report ``` -Once the environment has been created and activated we have to run: +To get a full report in html: + ```shell -python3 -m pip install pre-commit +./time-tracker.sh coverage html ``` +Then check in the [htmlcov/index.html](./htmlcov/index.html) to see it. -Once `pre-commit` library is installed, we are going to execute the following command: +If you want that previously collected coverage data is erased, you can execute: ```shell -pre-commit install -t pre-commit -t commit-msg +./time-tracker.sh coverage erase ``` -For more details, see section Development > Git hooks. -With this command the library will take configuration from `.pre-commit-config.yaml` and will set up the hooks by us. +
-### Commit message style +
-Use the following commit message style. e.g: +
+**Production Mode**
+
+### Setup
+
+#### Install the requirements:
+
+```
+python3 -m pip install -r requirements/<app>/<stage>.txt
+```
+
+If you use Windows, you will use this command:
+
+```
+python -m pip install -r requirements/<app>/<stage>.txt
+```
+
+Where `<app>` is one of the executable app namespaces, e.g. `time_tracker_api` or `time_tracker_events` (**Note:** Currently, only `time_tracker_api` is used.). The `<stage>` can be
+
+- `dev`: Used for working locally
+- `prod`: For anything deployed
+
+Bear in mind that the requirements for `time_tracker_events` must be located on its local requirements.txt, by
+[convention](https://docs.microsoft.com/en-us/azure/azure-functions/functions-reference-python#folder-structure).
+

### Set environment variables

If you use Bash or Git Bash, create a `.env` file and add the following variables:

```
export MS_AUTHORITY=XXX
export MS_CLIENT_ID=XXX
export MS_SCOPE=XXX
export MS_SECRET=XXX
export MS_ENDPOINT=XXX
export DATABASE_ACCOUNT_URI=XXX
export DATABASE_MASTER_KEY=XXX
export DATABASE_NAME=XXX
export FLASK_APP=XXX
export AZURE_APP_CONFIGURATION_CONNECTION_STRING=XXX
export FLASK_DEBUG=True
```

If you use PowerShell, create a `.env.ps1` file and add the following variables:

```
$env:MS_AUTHORITY="XXX"
$env:MS_CLIENT_ID="XXX"
$env:MS_SCOPE="XXX"
$env:MS_SECRET="XXX"
$env:MS_ENDPOINT="XXX"
$env:DATABASE_ACCOUNT_URI="XXX"
$env:DATABASE_MASTER_KEY="XXX"
$env:DATABASE_NAME="XXX"
$env:FLASK_APP="XXX"
$env:AZURE_APP_CONFIGURATION_CONNECTION_STRING="XXX"
$env:FLASK_DEBUG="True"
```

If you use Command Prompt, create a `.env.bat` file and add the following variables:

```
set "MS_AUTHORITY=XXX"
set "MS_CLIENT_ID=XXX"
set "MS_SCOPE=XXX"
set "MS_SECRET=XXX"
set "MS_ENDPOINT=XXX"
set "DATABASE_ACCOUNT_URI=XXX"
set "DATABASE_MASTER_KEY=XXX"
set "DATABASE_NAME=XXX"
set "FLASK_APP=XXX"
set "AZURE_APP_CONFIGURATION_CONNECTION_STRING=XXX"
set "FLASK_DEBUG=True"
```

> **Important:** Ask the development team for the values of the environment variables. Also note that
> you have to set the environment variables each time the application is run.

### Run application

- Start the app:

```shell
flask run
```

- Open `http://127.0.0.1:5000/` in a browser. In the presented UI you will find
  a link to the `swagger.json` with the definition of the API.

### Test

We are using [Pytest](https://docs.pytest.org/en/latest/index.html) for tests. The tests are located in the package
`tests` and use the [conventions for python test discovery](https://docs.pytest.org/en/latest/goodpractices.html#test-discovery).
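With this many variables it is easy to miss one before starting the app. A quick sanity check can catch that; the snippet below is an illustrative bash sketch (the function name is ours, not part of the project, and the `${!var}` indirection requires bash):

```shell
#!/usr/bin/env bash
# Hypothetical helper (not part of the repository): warn about any
# required variable that is unset or empty before running the app.
check_required_vars() {
  missing=0
  for var in "$@"; do
    # ${!var} is bash indirect expansion: the value of the variable named $var
    if [ -z "${!var}" ]; then
      echo "Missing environment variable: $var" >&2
      missing=1
    fi
  done
  return "$missing"
}

# Usage: the list mirrors the variables shown above.
check_required_vars MS_AUTHORITY MS_CLIENT_ID MS_SCOPE MS_SECRET MS_ENDPOINT \
  DATABASE_ACCOUNT_URI DATABASE_MASTER_KEY DATABASE_NAME FLASK_APP \
  AZURE_APP_CONFIGURATION_CONNECTION_STRING \
  || echo "Load your .env file before running flask."
```

If any name is reported as missing, source your `.env` file (or run the `.ps1`/`.bat` script) and try again.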
-

> Remember: to run any of the available test commands, the containers must be up (`docker-compose up`).

This command runs all tests:

```shell
-./time-tracker.sh pytest -v
+pytest -v
+```
+
+> **Note:** If you get the error "No module named azure.functions", execute the command `pip install azure-functions`.
+
+To run a single test:
+
+```shell
+pytest -v -k name-test
 ```

 #### Coverage
@@ -166,26 +287,81 @@ This command run all tests:

 To check the coverage of the tests, execute:

 ```shell
-./time-tracker.sh coverage run -m pytest -v
+coverage run -m pytest -v
```

To get a report table:

```shell
-./time-tracker.sh coverage report
+coverage report
```

To get a full report in html:

```shell
-./time-tracker.sh coverage html
+coverage html
```
Then open [htmlcov/index.html](./htmlcov/index.html) to see it.

If you want to erase previously collected coverage data, you can execute:

```shell
-./time-tracker.sh coverage erase
+coverage erase
+```
+
+ +
+

### Git hooks
We use the [pre-commit](https://github.com/pre-commit/pre-commit) library to manage local git hooks.
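The hooks themselves are declared in `.pre-commit-config.yaml` at the repository root. As an illustration only (the hook ids below come from the standard `pre-commit-hooks` and `black` repositories, and are not necessarily this project's exact configuration), such a file looks like:

```yaml
# Illustrative example — the repository's own .pre-commit-config.yaml
# is the source of truth for which hooks actually run.
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v3.2.0
    hooks:
      - id: trailing-whitespace   # strip trailing spaces before committing
      - id: end-of-file-fixer     # ensure files end with a newline
  - repo: https://github.com/psf/black
    rev: 21.7b0
    hooks:
      - id: black                 # format Python code automatically
```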
+
+
 Open if you use Development mode

To install and use `pre-commit` in development mode, run the following command:

```shell
python3 -m pip install pre-commit
```

> Remember to execute this command with the virtual environment active.

Once the `pre-commit` library is installed, we can continue with the guide.
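Among other things, the `commit-msg` hook that gets installed can reject messages that do not follow the project's `type: TT-### description` convention. A minimal sketch of such a check (the function name and the exact regex are our illustration, not the project's actual hook):

```shell
# Illustrative sketch of a commit-message check; the real validation is
# whatever hook .pre-commit-config.yaml configures for the commit-msg stage.
validate_commit_msg() {
  # Accepts e.g. 'feat: TT-123 ...', 'fix: TT-321 ...', 'feat(config): TT-00 ...'
  echo "$1" | grep -Eq '^[a-z]+(\([a-z_-]+\))?: TT-[0-9]+ .+'
}

validate_commit_msg 'feat: TT-123 Applying some changes' && echo "accepted"
validate_commit_msg 'changed stuff' || echo "rejected"
```

A hook like this receives the commit message file as its argument and aborts the commit when the check fails.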
+ +
+
This library allows you to execute code right before each commit, for example:
- Check that the commit contains the correct formatting.
- Format modified files based on a style guide such as PEP 8, etc.

As developers, we just need to run in our virtual environment:
```shell
pre-commit install -t pre-commit -t commit-msg
```

With this command the library will take the configuration from `.pre-commit-config.yaml` and will set up the hooks for us.

### Commit message style

Use the following commit message style, e.g.:

```shell
'feat: TT-123 Applying some changes'
'fix: TT-321 Fixing something broken'
'feat(config): TT-00 Fix something in config files'
```

The value `TT-###` refers to the Jira issue that is being solved. Use TT-00 if the commit does not refer to any issue.

### Branch names format

For example, if your task in Jira is **TT-48 implement semantic versioning**, your branch name is:

```shell
TT-48-implement-semantic-versioning
```

### Handling Cosmos DB triggers for creating events with time_tracker_events
diff --git a/cosmosdb_emulator/README.md b/cosmosdb_emulator/README.md
index 840247ee..a5ebc402 100644
--- a/cosmosdb_emulator/README.md
+++ b/cosmosdb_emulator/README.md
@@ -4,6 +4,8 @@
 Here you can find all the source code of the Time Tracker CLI. This is responsible for automatically generating fake
 data for the Cosmos emulator, in order to have information when testing new features or correcting bugs.

+> This feature is only available in development mode.
+
 ## Prerequisites
- Backend and cosmos emulator containers up.