automatically [pip](https://pip.pypa.io/en/stable/) as well.

### Setup
- Create and activate the environment,

In Windows:
```
python -m venv .venv
.venv\Scripts\activate.bat
```

In Unix-based operating systems:
```
virtualenv .venv
source .venv/bin/activate
Type the following command in the terminal to install the required dependency:
```sh
sudo apt-get install unixodbc-dev
```

- Install the requirements:
```
python3 -m pip install -r requirements/<app>/<stage>.txt
```

Where `<app>` is one of the executable app namespaces, e.g. `time_tracker_api` or `time_tracker_events`.
The `stage` can be:

* `dev`: Used for working locally
* `prod`: For anything deployed


Remember to do it with Python 3.

Bear in mind that the requirements for `time_tracker_events` must be located in its local `requirements.txt`, by
[convention](https://docs.microsoft.com/en-us/azure/azure-functions/functions-reference-python#folder-structure).

- Run `pre-commit install -t pre-commit -t commit-msg`. For more details, see section Development > Git hooks.


### Set environment variables
Set environment variables with the content pinned in our slack channel #time-tracker-developer:

```
export MS_AUTHORITY=XXX
export MS_CLIENT_ID=XXX
export MS_SCOPE=XXX
export MS_SECRET=XXX
export MS_ENDPOINT=XXX
export DATABASE_ACCOUNT_URI=XXX
export DATABASE_MASTER_KEY=XXX
export DATABASE_NAME=XXX
export FLASK_APP=XXX
export AZURE_APP_CONFIGURATION_CONNECTION_STRING=XXX
export FLASK_DEBUG=True
```
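As a quick sanity check before starting the app, you can list which of the variables above are still unset. This is a minimal sketch; `missing_env_vars` is a hypothetical helper, not part of the project:

```python
import os

# Hypothetical helper: lists which of the variables above are still unset.
REQUIRED_VARS = [
    "MS_AUTHORITY", "MS_CLIENT_ID", "MS_SCOPE", "MS_SECRET", "MS_ENDPOINT",
    "DATABASE_ACCOUNT_URI", "DATABASE_MASTER_KEY", "DATABASE_NAME",
    "FLASK_APP", "AZURE_APP_CONFIGURATION_CONNECTION_STRING",
]

def missing_env_vars(names=REQUIRED_VARS):
    """Return the names that are unset or empty in the environment."""
    return [name for name in names if not os.environ.get(name)]
```

If the returned list is non-empty, export the missing variables before running the app.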

### How to use it
- Set the env var `FLASK_APP` to `time_tracker_api` and start the app:

In Windows:
```
set FLASK_APP=time_tracker_api
flask run
```
In Unix-based operating systems:
```
export FLASK_APP=time_tracker_api
flask run
```

- Open `http://127.0.0.1:5000/` in a browser. In the presented UI you will find
a link to the `swagger.json` with the definition of the API.

#### Handling Cosmos DB triggers for creating events with time_tracker_events
generated by the console app you ran before.

### Security
In this API we require authenticated users using JWT. To do so, we use the library
[PyJWT](https://pypi.org/project/PyJWT/), so in every request to the API we expect a header `Authorization` with a format
like:

>Bearer <JWT>

In the Swagger UI, you will now see a new button called "Authorize":
![image](https://user-images.githubusercontent.com/6514740/80011459-841f7580-8491-11ea-9c23-5bfb8822afe6.png)

When you click it, you will be notified that you must enter the content of the Authorization header, as mentioned
before:
![image](https://user-images.githubusercontent.com/6514740/80011702-d95b8700-8491-11ea-973a-8aaf3cdadb00.png)

Click "Authorize" and then close that dialog. From that moment on you will not have to do it again, because the
Swagger UI will use that JWT in every call, e.g.
![image](https://user-images.githubusercontent.com/6514740/80011852-0e67d980-8492-11ea-9dd3-2b1efeaa57d8.png)

If you want to check out the data (claims) that your JWT contains, you can also use the CLI of
[PyJWT](https://pypi.org/project/PyJWT/):
```
pyjwt decode --no-verify "<JWT>"
```

Bear in mind that this API is not in charge of verifying the authenticity of the JWT.
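If you just want a quick look at the claims without extra tooling, note that the payload segment of a JWT is plain base64url-encoded JSON. A minimal sketch (hypothetical helper, not part of the project; it does not verify the signature):

```python
import base64
import json

def decode_jwt_payload(token: str) -> dict:
    """Return the (unverified) claims of a JWT.

    A JWT has the shape header.payload.signature, with each segment
    base64url-encoded.
    """
    payload_b64 = token.split(".")[1]
    # base64url segments may omit padding; restore it to a multiple of 4.
    payload_b64 += "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(payload_b64))
```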
Due to the technology used and particularities of the implementation of this API, it is important that you respect the
following notes regarding the manipulation of data from and towards the API:

- The [recommended](https://docs.microsoft.com/en-us/azure/cosmos-db/working-with-dates#storing-datetimes) format for
DateTime strings in Azure Cosmos DB is `YYYY-MM-DDThh:mm:ss.fffffffZ` which follows the ISO 8601 **UTC standard**.
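A sketch of producing that format from Python (an assumption about how you might serialize, not project code): `strftime`'s `%f` yields six fractional digits, so one zero is appended to reach the seven digits of the recommended format:

```python
from datetime import datetime, timezone

def to_cosmos_datetime(dt: datetime) -> str:
    """Format an aware datetime as YYYY-MM-DDThh:mm:ss.fffffffZ (UTC).

    %f gives microseconds (6 digits); a trailing zero pads to 7 digits.
    """
    return dt.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%f") + "0Z"
```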

The Azure function project `time_tracker_events` also has some constraints to take into account.

### Git hooks
We use the [pre-commit](https://github.com/pre-commit/pre-commit) library to manage git hooks:
```
pre-commit install -t pre-commit -t commit-msg
```
With this command the library will take the configuration from `.pre-commit-config.yaml` and set up the hooks for us.


### Commit message style
Use the following commit message style:
The value `TT-###` refers to the Jira issue that is being solved. Use TT-00 if the commit does not refer to any issue.
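The issue reference can be checked mechanically; this is a hypothetical illustration, not one of the project's actual hooks:

```python
import re

# Matches Jira references such as TT-48, or the TT-00 fallback.
ISSUE_REF = re.compile(r"\bTT-\d+\b")

def has_issue_reference(message: str) -> bool:
    """Return True if the commit message references a Jira issue."""
    return ISSUE_REF.search(message) is not None
```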

### Branch names format
For example, if your task in Jira is **TT-48 implement semantic versioning**, your branch name is:
```
TT-48-implement-semantic-versioning
```
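The convention above can be sketched as a small helper (hypothetical, for illustration only):

```python
def branch_name(issue_key: str, title: str) -> str:
    """Build a branch name such as TT-48-implement-semantic-versioning."""
    slug = "-".join(title.lower().split())
    return f"{issue_key}-{slug}"
```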

### Test
We are using [Pytest](https://docs.pytest.org/en/latest/index.html) for tests. The tests are located in the package
`tests` and use the [conventions for python test discovery](https://docs.pytest.org/en/latest/goodpractices.html#test-discovery).

#### Integration tests
The [integration tests](https://en.wikipedia.org/wiki/Integration_testing) verify that all the components of the app
are working well together. These are the default tests we should run.

This command runs all tests:
```
python3 -m pytest -v --ignore=tests/commons/data_access_layer/azure/sql_repository_test.py
```

To run a single test:
```
pytest -v -k name-test
```
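`-k` selects tests whose names match the given expression. For instance, given a hypothetical test following the discovery conventions:

```python
# tests/sample_test.py (hypothetical): pytest discovers test_* files
# and test_* functions automatically.
def test_addition_is_commutative():
    assert 1 + 2 == 2 + 1
```

`pytest -v -k addition` would then run only this test.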

As you may have noticed, we are ignoring the tests related to the repository.


#### System tests
In addition to the integration testing, we might include tests of the data access layer in order to verify that the
persisted data is being managed the right way, i.e. it actually works. We may classify the execution of all the existing
tests as [system testing](https://en.wikipedia.org/wiki/System_testing):

```
python3 -m pytest -v
```

The database tests will be done in the table `tests` of the database specified by the variable `SQL_DATABASE_URI`. If this
variable is not specified, it will automatically connect to an in-memory SQLite database. This will do, because we are using
[SQL Alchemy](https://www.sqlalchemy.org/features.html) to be able to connect to any SQL database while maintaining the same
codebase.
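The fallback can be pictured as follows (a sketch; the project may resolve the URI differently):

```python
import os

def database_uri() -> str:
    """Return SQL_DATABASE_URI if set, else SQLAlchemy's in-memory SQLite URI."""
    return os.environ.get("SQL_DATABASE_URI", "sqlite://")
```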


The option `-v` shows which tests failed or succeeded. Bear in mind that you can also debug each test
(test_* files) with the help of an IDE like PyCharm.

#### Coverage
To check the coverage of the tests, execute:
coverage run -m pytest -v
```

To get a report table:

```bash
coverage report
```
python cli.py gen_swagger_json -f ~/Downloads/swagger.json
```
## Semantic versioning

### Style
We use the [angular commit message style](https://github.com/angular/angular.js/blob/master/DEVELOPERS.md#commits) as the
standard commit message style.

### Release
1. The release is automatically done by the [TimeTracker CI](https://dev.azure.com/IOET-DevOps/TimeTracker-API/_build?definitionId=1&_a=summary),
although it can also be done manually. The variable `GH_TOKEN` is required to post releases to GitHub. The `GH_TOKEN` can
be generated following [these steps](https://help.github.com/es/github/authenticating-to-github/creating-a-personal-access-token-for-the-command-line).

2. We use the command `semantic-release publish` after a successful PR to make a release. Check the library
[python-semantic-release](https://python-semantic-release.readthedocs.io/en/latest/commands.html#publish) for details of the
underlying operations.

## Run as docker container
```
pip install -r requirements/<app>/prod.txt
pip install -r requirements/migrations.txt
```

All the migrations will be handled and created in the python package `migrations`. In order to create a migration we
must do it manually (for now), prefixed by a number, e.g. `migrations/01-initialize-db.py`, in order to guarantee the
order of execution alphabetically.
Inside every migration there is an `up` and a `down` method. The `down` method is executed from the persisted migration in
the database. When `down` logic that used external dependencies was tested, it failed; whereas the same logic placed in
the `up` method ran correctly. In general, the library seems to present [design issues](https://github.com/Lieturd/migrate-anything/issues/3).
Therefore, it is recommended to apply changes in just one direction: `up`.
For more information, please check out [some examples](https://github.com/Lieturd/migrate-anything/tree/master/examples)
that illustrate the usage of this migration tool.
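The numeric prefixes matter because execution order is simply alphabetical order. A sketch (hypothetical, not project code) of why zero-padding works:

```python
def ordered_migrations(filenames):
    """Return the .py migrations in their execution (alphabetical) order."""
    return sorted(f for f in filenames if f.endswith(".py"))
```

With zero-padded prefixes, `01-...` sorts before `02-...` and before `10-...`, matching the intended order.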

Basically, for running the migrations you must execute:
They will be automatically run during the Continuous Deployment process.


## Built with
- [Python version 3](https://www.python.org/download/releases/3.0/) as backend programming language. Strong typing for
the win.
- [Flask](http://flask.pocoo.org/) as the micro framework of choice.
- [Flask RestPlus](https://flask-restplus.readthedocs.io/en/stable/) for building Restful APIs with Swagger.
for making `time_tracker_events` handle the triggers generated by our Cosmos DB.
Shared file with all the Feature Toggles we create, so we can have a history of them:
[Feature Toggles dictionary](https://github.com/ioet/time-tracker-ui/wiki/Feature-Toggles-dictionary)

## More information about the project
[Starting in Time Tracker](https://github.com/ioet/time-tracker-ui/wiki/Time-tracker)

## License

Copyright 2020 ioet Inc. All Rights Reserved.