I've copied most of the descriptions from what we did for batch metrics
under the Edge tag; however, there are a couple of top-level descriptions
that are "fine" but I'm definitely open to suggestions.
---------
Co-authored-by: Thomas Heartman <thomas@getunleash.ai>
* Docs: start experimenting with OpenAPI and docusaurus
* Docs: add docusaurus-theme-openapi-docs pkg
* Wip: current status
* Docs: Add 'docusaurus-plugin-api-docs'
* Move openapi into own sidebar; generate from localhost
* Chore: Update docusaurus plugin for OpenAPI
* Add website/yarn.lock to git
* Fix: fix CSS warning by using flex-end instead of end
* docs: make openapi generated code work again
* docs: make tags work properly with openapi sidebar
* Docs/chore: update OpenAPI tag scheme.
Add a whole bunch of new tags to make it easier to understand the
available tags in OpenAPI.
* docs: point to new openapi docs from old api docs
* docs: typo
* Docs: link restructure
* docs: add operation indicators to openapi docs
* docs: change badge color for operations
* docs: update openapi-docs package
It now sorts tags in the same order as the schema
* docs: pluralize APIs in slug
* docs: update links to generated api docs
* docs: update openapi snapshot tests with new tags
* docs: conditionally load spec from localhost or from file (sketched after this list)
* docs: Remove changes relating to immediate switchover
* refactor: rename types; extract into separate file
* docs: fix api doc links
* Refactor: move openapi utils into /util directory
* Refactor: move utils test into `util` directory
* Refactor: don't expose standard responses tied to status codes
* Feat: update empty response description + make it const
* Chore: update snapshot with new response descriptions
* refactor: add an hoursBack query param to the raw metrics endpoint (parsing sketched after this list)
* refactor: explicitly return undefined
* refactor: make parseHoursBackQueryParam non-static
* refactor: add test for hoursBack query param
* refactor: improve arg name
* refactor: add a 1 hour test case
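One of the commits above makes the docs load the OpenAPI spec either from a locally running server or from a checked-in file. A minimal sketch of that kind of conditional, where the environment variable, URL, and file path are all assumptions (4242 is Unleash's default port, but the real config may differ):

```ts
// Hypothetical sketch: choose the spec source at build time.
// None of these names come from the commits above.
const useLocalSpec = process.env.OPENAPI_SOURCE === 'localhost';

const specSource = useLocalSpec
    ? 'http://localhost:4242/docs/openapi.json' // a locally running Unleash server
    : './openapi.json'; // a checked-in snapshot of the spec
```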
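The `hoursBack` commits describe a validated query parameter on the raw metrics endpoint. Here is a minimal sketch of the parsing, assuming an Express-style handler: only the name `parseHoursBackQueryParam` comes from the commits (where it is a method rather than a free function); the bounds are assumptions, though the 48-hour cap matches the retention window described below.

```ts
// Hypothetical sketch of hoursBack parsing; only the function name
// parseHoursBackQueryParam appears in the commits above. The rest
// (bounds, Express types) is assumed.
import type { Request } from 'express';

const MAX_HOURS_BACK = 48;

function parseHoursBackQueryParam(query: Request['query']): number | undefined {
    const raw = query.hoursBack;
    if (typeof raw !== 'string') {
        // Explicitly return undefined when the param is absent or malformed.
        return undefined;
    }
    const hoursBack = Number.parseInt(raw, 10);
    if (Number.isNaN(hoursBack) || hoursBack < 1 || hoursBack > MAX_HOURS_BACK) {
        return undefined;
    }
    return hoursBack;
}
```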
Adds a new way of handling usage metrics where we push them directly to the database and perform aggregation on the fly. All metrics are aggregated into hourly buckets. For now, we will store metrics for the last 48 hours with the following dimensions (a sketch of the bucket shape follows the list):
- featureName
- projectName
- environment
- yes (the actual count)
- no (the actual count)
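To make that shape concrete, here is a minimal sketch of one hourly bucket row using the dimensions above; the field names, types, and the truncation helper are assumptions, not the actual schema.

```ts
// Minimal sketch of one hourly bucket row, built from the dimensions
// listed above. Field and type names are assumptions; the real schema
// may differ.
interface HourlyMetricsBucket {
    featureName: string;
    projectName: string;
    environment: string;
    hourBucket: Date; // timestamp truncated to the start of the hour
    yes: number; // the actual "yes" count for this hour
    no: number; // the actual "no" count for this hour
}

// Truncating an incoming timestamp to its hourly bucket.
function toHourBucket(timestamp: Date): Date {
    const bucket = new Date(timestamp);
    bucket.setUTCMinutes(0, 0, 0);
    return bucket;
}
```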