time-tracker-api


This is the mono-repository for the backend services and their common codebase.

Getting started

Follow these instructions to get the project ready to use as soon as possible.

Requirements

Be sure you have the following installed on your system:

  • Python 3 (3.8 or lower recommended) on your PATH. Installing Python also installs pip.

  • A virtual environment, namely .venv.

  • Optionally, for running Azure Functions locally: Azure Functions Core Tools.

  • Docker. Follow the official installation instructions for your operating system.

  • Docker Compose. To install it, choose the operating system you use and follow the steps here.

Setup

Once Docker and Docker Compose are installed, create a .env file in the root of the project containing the following environment variables:

```shell
export MS_AUTHORITY=XXXX
export MS_CLIENT_ID=XXXX
export MS_SCOPE=XXXX
export MS_SECRET=yFo=XXXX
export MS_ENDPOINT=XXXX
export DATABASE_ACCOUNT_URI=XXXX
export DATABASE_MASTER_KEY=XXXX
export DATABASE_NAME=XXXX
export FLASK_APP=XXXX
export AZURE_APP_CONFIGURATION_CONNECTION_STRING=XXXX
export FLASK_DEBUG=XXXX
export REQUESTS_CA_BUNDLE=XXXX
```

Please contact the project development team for the values of the variables mentioned above.
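Since the services read these settings from the environment at startup, a quick sanity check before bringing the containers up can save a confusing failure later. The helper below is a hypothetical sketch (not part of the repo); the variable names are taken from the list above:

```python
import os

# Names taken from the .env template above; extend as needed.
REQUIRED_VARS = [
    "MS_AUTHORITY", "MS_CLIENT_ID", "MS_SCOPE", "MS_SECRET",
    "DATABASE_ACCOUNT_URI", "DATABASE_MASTER_KEY", "DATABASE_NAME",
    "FLASK_APP",
]

def missing_env_vars(environ=os.environ):
    # Return the required settings that are absent or empty.
    return [name for name in REQUIRED_VARS if not environ.get(name)]
```

Running `missing_env_vars()` after sourcing your .env should return an empty list.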

Run containers

Once all the project configuration is done, run the following command from the root folder of the project:

```shell
docker-compose up --build
```

This command builds all the images with the necessary configuration for each one and also starts the Cosmos emulator together with the backend. You can then open in the browser:

  • http://127.0.0.1:5000/ open backend API.
  • https://127.0.0.1:8081/_explorer/index.html to open Cosmos DB emulator.

If you have already run docker-compose up --build in this project before, it is not necessary to run it again; instead, run docker-compose up.

It is also important to clarify that if packages or any extra configuration are added to the image build, you need to run docker-compose up --build again. You can find more information about this flag here.

Development

Generate Fake Data

To generate fake data for testing functionality or reproducing errors, we have built a CLI called Time Tracker CLI, which is in charge of generating the fake information inside the Cosmos emulator.

To learn how this CLI works, you can see the instructions here

Git hooks

We use the pre-commit library to manage local git hooks; as developers, we just need to run a few commands in our virtual environment.

This library allows you to execute code right before each commit, for example:

  • Check that the commit message has the correct format.
  • Format modified files according to a style guide such as PEP 8.

To install and use pre-commit we have to perform the following steps:

Create the environment

Execute the following command at the root of the project:

```shell
python -m venv .venv
```

Note: You can replace python with python3 or python3.8 according to the version you have installed, but do not forget the initial requirements.

Activate the environment

On Windows:

```shell
.venv\Scripts\activate.bat
```

On Unix-based operating systems:

```shell
source .venv/bin/activate
```

Once the environment has been created and activated, run:

```shell
python3 -m pip install pre-commit
```

Once the pre-commit library is installed, run:

```shell
pre-commit install -t pre-commit -t commit-msg
```


With this command the library will take the configuration from .pre-commit-config.yaml and set up the hooks for us.
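For reference, a minimal .pre-commit-config.yaml might look like the sketch below; the hook and version shown are illustrative assumptions, so check the actual file in the repository root:

```yaml
# Illustrative only -- the repository's real .pre-commit-config.yaml
# is the source of truth.
repos:
  - repo: https://github.com/psf/black
    rev: 22.3.0
    hooks:
      - id: black
```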

Commit message style

Use the following commit message style, e.g.:

```
feat: TT-123 Applying some changes
fix: TT-321 Fixing something broken
feat(config): TT-00 Fix something in config files
```

The value TT-### refers to the Jira issue that is being solved. Use TT-00 if the commit does not refer to any issue.
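As a sketch, the style above can be checked mechanically with a small regular expression. The list of accepted types here is an assumption based on the Angular convention mentioned in the Semantic versioning section, not taken from the repo:

```python
import re

# Pattern for "<type>(<scope>)?: TT-<number> <message>".
# The set of types is an assumption (Angular-style); adjust to taste.
COMMIT_RE = re.compile(
    r"^(feat|fix|docs|style|refactor|perf|test|chore)"
    r"(\([\w-]+\))?: TT-\d+ .+$"
)

def is_valid_commit_message(message: str) -> bool:
    return COMMIT_RE.match(message) is not None
```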

Branch names format

For example, if your task in Jira is TT-48 implement semantic versioning, your branch name is:

```
TT-48-implement-semantic-versioning
```
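Deriving the branch name from the issue key and title is mechanical; a tiny illustrative helper:

```python
def branch_name(issue_key: str, title: str) -> str:
    # Lowercase the Jira title and join its words with hyphens.
    return f"{issue_key}-" + "-".join(title.lower().split())
```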

Test

We use Pytest for tests. The tests are located in the tests package and follow the standard conventions for Python test discovery.

Remember: to run any of the available test commands, the containers must be up (docker-compose up).

This command runs all tests:

```shell
./time-tracker.sh pytest -v
```
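As a reminder of the discovery conventions, Pytest collects files named test_*.py containing functions named test_*. A generic illustration (not an actual file from this repo):

```python
# tests/test_example.py -- illustrative only.
# Pytest collects files named test_*.py and functions named test_*.

def add(a: int, b: int) -> int:
    return a + b

def test_add():
    assert add(2, 3) == 5
```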

Coverage

To check the coverage of the tests, execute:

```shell
./time-tracker.sh coverage run -m pytest -v
```

To get a report table:

```shell
./time-tracker.sh coverage report
```

To get a full report in HTML:

```shell
./time-tracker.sh coverage html
```

Then open htmlcov/index.html to see it.

If you want previously collected coverage data to be erased, execute:

```shell
./time-tracker.sh coverage erase
```

Handling Cosmos DB triggers for creating events with time_tracker_events

The project time_tracker_events is an Azure Functions project. Its main responsibility is to respond to event-related calls, such as those triggered by the Change Feed. Every time Cosmos DB performs a write action (create, update, soft-delete), these functions are called through bindings. You can also run them on your local machine:

  • Sign in to Azure:

```shell
az login
```

  • Execute the project:

```shell
cd time_tracker_events
source run.sh
```

You will see a long console log appear, ending with a message like:

```
Now listening on: http://0.0.0.0:7071
Application started. Press Ctrl+C to shut down.
```
  • Now you are ready to start generating events. Just make any change through the API and you will see logs being generated by the console app you ran before. For instance, this is the log generated when a time entry was restarted:
```
[04/30/2020 14:42:12] Executing 'Functions.handle_time_entry_events_trigger' (Reason='New changes on collection time_entry at 2020-04-30T14:42:12.1465310Z', Id=3da87e53-0434-4ff2-8db3-f7c051ccf9fd)
[04/30/2020 14:42:12]  INFO: Received FunctionInvocationRequest, request ID: 578e5067-b0c0-42b5-a1a4-aac858ea57c0, function ID: c8ac3c4c-fefd-4db9-921e-661b9010a4d9, invocation ID: 3da87e53-0434-4ff2-8db3-f7c051ccf9fd
[04/30/2020 14:42:12]  INFO: Successfully processed FunctionInvocationRequest, request ID: 578e5067-b0c0-42b5-a1a4-aac858ea57c0, function ID: c8ac3c4c-fefd-4db9-921e-661b9010a4d9, invocation ID: 3da87e53-0434-4ff2-8db3-f7c051ccf9fd
[04/30/2020 14:42:12] {"id": "9ac108ff-c24d-481e-9c61-b8a3a0737ee8", "project_id": "c2e090fb-ae8b-4f33-a9b8-2052d67d916b", "start_date": "2020-04-28T15:20:36.006Z", "tenant_id": "cc925a5d-9644-4a4f-8d99-0bee49aadd05", "owner_id": "709715c1-6d96-4ecc-a951-b628f2e7d89c", "end_date": null, "_last_event_ctx": {"user_id": "709715c1-6d96-4ecc-a951-b628f2e7d89c", "tenant_id": "cc925a5d-9644-4a4f-8d99-0bee49aadd05", "action": "update", "description": "Restart time entry", "container_id": "time_entry", "session_id": null}, "description": "Changing my description for testing Change Feed", "_metadata": {}}
[04/30/2020 14:42:12] Executed 'Functions.handle_time_entry_events_trigger' (Succeeded, Id=3da87e53-0434-4ff2-8db3-f7c051ccf9fd)
```

Security

In this API we require authenticated users via JWT. To do so, we use the PyJWT library, and every request to the API must carry an Authorization header in the format:

```
Bearer <JWT>
```

In the Swagger UI you will see a button called "Authorize".

When you click it, you will be asked to enter the content of the Authorization header, as mentioned before.

Click "Authorize" and then close the dialog. From that moment on you will not have to do it again, because the Swagger UI will use that JWT in every call.

If you want to check out the data (claims) that your JWT contains, you can also use the CLI of PyJWT:

```shell
pyjwt decode --no-verify "<JWT>"
```
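Nothing magical happens during that decode: a JWT is three base64url-encoded segments (header.payload.signature), and the claims are simply the JSON in the middle segment. A stdlib-only sketch of reading them without verification:

```python
import base64
import json

def jwt_claims(token: str) -> dict:
    # Take the middle (payload) segment and restore the base64 padding
    # that JWTs strip, then decode it as JSON. No key is needed to READ
    # claims; a key is only needed to VERIFY the signature.
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)
    return json.loads(base64.urlsafe_b64decode(payload))
```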

Bear in mind that this API is not in charge of verifying the authenticity of the JWT; that is handled by the API Management.

Important notes

Due to the technology used and particularities of the implementation of this API, it is important that you respect the following notes regarding the manipulation of data from and towards the API:

  • The recommended format for DateTime strings in Azure Cosmos DB is YYYY-MM-DDThh:mm:ss.fffffffZ which follows the ISO 8601 UTC standard.
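Note that Python's datetime.strftime produces only six fractional digits (%f is microseconds), while the Cosmos DB format above uses seven. A small illustrative helper (not from the repo) that pads accordingly:

```python
from datetime import datetime, timezone

def to_cosmos_datetime(dt: datetime) -> str:
    # Normalize to UTC, format with %f (6 digits of microseconds),
    # then append one zero to reach the 7-digit precision of
    # YYYY-MM-DDThh:mm:ss.fffffffZ.
    return dt.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%f") + "0Z"
```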

The Azure Functions project time_tracker_events also has some constraints to take into account. It is recommended that you read the Azure Functions Python developer guide.

If you need to deploy time_tracker_events from your local machine to Azure Functions, you can execute:

```shell
func azure functionapp publish time-tracker-events --build local
```

CLI

There are commands available, aware of the API, that can be very helpful to you. You can check them out by running:

```shell
python cli.py
```

If you want to run a specific command, e.g. gen_swagger_json, specify it as a parameter along with its corresponding options:

```shell
python cli.py gen_swagger_json -f ~/Downloads/swagger.json
```

Semantic versioning

Style

We use the Angular commit message style as the standard commit message style.

Release

  1. The release is done automatically by the TimeTracker CI, although it can also be done manually. The variable GH_TOKEN is required to post releases to GitHub; it can be generated following these steps.

  2. We use the command semantic-release publish after a successful PR to make a release. Check the library python-semantic-release for details of the underlying operations.

Migrations

Looking for a DB-agnostic migration tool, the only choice found was migrate-anything. A specific requirements file, requirements/migrations.txt, was created for running the migrations. This way we do not mix any potentially vulnerable dependencies brought in by the migration tooling into the production environment. Therefore, the dependencies to run the migrations should be installed like this:

```shell
pip install -r requirements/<app>/prod.txt
pip install -r requirements/migrations.txt
```

All migrations are handled and created in the Python package migrations. For now, migrations must be created manually and prefixed with a number, e.g. migrations/01-initialize-db.py, to guarantee alphabetical execution order. Every migration has an up and a down method. The down method is executed from the persisted migration in the database. When down logic that used external dependencies was tested it failed, whereas the same logic placed in the up method ran correctly; in general, the library seems to have design issues. Therefore, it is recommended to apply changes in one direction only: up. For more information, please check out the examples that illustrate the usage of this migration tool.
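The shape of such a migration module, as a rough placeholder sketch (the exact signatures migrate-anything expects should be checked in its documentation; the names below are illustrative assumptions):

```python
# migrations/01-initialize-db.py -- placeholder sketch only; the exact
# function signatures expected by migrate-anything may differ.

def up(db):
    # Forward migration: create containers, seed reference data, etc.
    pass

def down(db):
    # Reverse migration. Per the note above, prefer keeping this a no-op
    # and applying changes only in the "up" direction.
    pass
```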

Basically, to run the migrations you must execute:

```shell
migrate-anything migrations
```

They will be automatically run during the Continuous Deployment process.


Feature Toggles dictionary

A shared file with all the Feature Toggles we create, so we can keep a history of them: Feature Toggles dictionary.

More information about the project

Starting in Time Tracker

License

Copyright 2020 ioet Inc. All Rights Reserved.
