3 changes: 3 additions & 0 deletions .editorconfig
Original file line number Diff line number Diff line change
@@ -19,3 +19,6 @@ indent_size = 4

[{Makefile,*.mk,go.mod,go.sum,*.go,.gitmodules}]
indent_style = tab

[*.jmx]
indent_style = unset
3 changes: 1 addition & 2 deletions .github/actions/check-english-usage/action.yaml
@@ -6,5 +6,4 @@ runs:
- name: "Check English usage"
shell: bash
run: |
export BRANCH_NAME=origin/${{ github.event.repository.default_branch }}
check=branch ./scripts/githooks/check-english-usage.sh
check=all ./scripts/githooks/check-english-usage.sh
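These actions now run the checks with `check=all` instead of `check=branch`, so they no longer depend on `BRANCH_NAME`. A rough sketch of how such a mode flag is commonly consumed (an assumption about the pattern — the real `check-*.sh` scripts may differ):

```shell
# Sketch only: how a "check" mode variable can select a githook's file
# scope. This is an illustration of the pattern, not the real script.
check="${check:-staged}"
case "$check" in
  all)    scope="all tracked files" ;;              # what the actions use now
  branch) scope="files changed vs \$BRANCH_NAME" ;; # the previous behaviour
  *)      scope="staged files only" ;;              # typical local-hook default
esac
echo "Checking ${scope}"
```

With `check=all` there is no dependency on the default branch, which is why the `BRANCH_NAME` export could be dropped from all three actions.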
3 changes: 1 addition & 2 deletions .github/actions/check-file-format/action.yaml
@@ -6,5 +6,4 @@ runs:
- name: "Check file format"
shell: bash
run: |
export BRANCH_NAME=origin/${{ github.event.repository.default_branch }}
check=branch ./scripts/githooks/check-file-format.sh
check=all ./scripts/githooks/check-file-format.sh
3 changes: 1 addition & 2 deletions .github/actions/check-markdown-format/action.yaml
@@ -6,5 +6,4 @@ runs:
- name: "Check Markdown format"
shell: bash
run: |
export BRANCH_NAME=origin/${{ github.event.repository.default_branch }}
check=branch ./scripts/githooks/check-markdown-format.sh
check=all ./scripts/githooks/check-markdown-format.sh
12 changes: 7 additions & 5 deletions README.md
@@ -19,7 +19,7 @@ We will increase appointment attendance and reduce the number of 'did not attend

### GitHub

- As per NHS guidelines, make your GitHub email private by going [here](https://github.com/settings/emails). There is a checkbox named "Keep my email addresses private". Note down your private email from this setting.
- As per NHS guidelines, make your GitHub email private by going to [GitHub settings](https://github.com/settings/emails). There is a checkbox named "Keep my email addresses private". Note down your private email from this setting.
- Follow these [instructions](https://nhsd-confluence.digital.nhs.uk/display/Vacc/Developer+setup%3A+Github).
- Remember to use your private email, noted above, in GitHub config 'user.email'.
- When on the step to create personal access tokens, remember to also tick 'workflow'. This will allow developers to update workflows.
@@ -35,13 +35,15 @@ From NHS repository template:
On macOS, you will need Homebrew installed, then to install make, like so:

```shell
brew install make gnu-sed gawk coreutils binutils jmeter
brew install make gnu-sed gawk coreutils binutils editorconfig-checker jmeter
```

Find out which homebrew path you have by using this command:

```shell
brew --prefix make
```

Based on the beginning of the path returned, select which HOMEBREW_PATH to export.
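For example, the mapping from the prefix onto `HOMEBREW_PATH` can be sketched like this (paths are the usual Homebrew defaults; the fallback value is purely illustrative):

```shell
# Sketch: choose HOMEBREW_PATH from the beginning of `brew --prefix` output.
prefix="$(brew --prefix make 2>/dev/null || echo /opt/homebrew/opt/make)"
case "$prefix" in
  /opt/homebrew/*) HOMEBREW_PATH="/opt/homebrew" ;;       # Apple-silicon default
  /usr/local/*)    HOMEBREW_PATH="/usr/local" ;;          # Intel default
  *)               HOMEBREW_PATH="${prefix%/opt/make}" ;; # anything else
esac
export HOMEBREW_PATH
echo "$HOMEBREW_PATH"
```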
- Override default OSX tools with their GNU equivalents
- On M1 Macs and up:
@@ -87,7 +89,7 @@ From NHS repository template:

- **Colima** - or any equivalent Docker container runtime, e.g. [Rancher Desktop](https://rancherdesktop.io/), etc.

- **Act** - tool to run GitHub actions locally. Usage guide is available [here](https://nektosact.com/usage/index.html)
- **Act** - tool to run GitHub actions locally. Usage guide is available in this [article](https://nektosact.com/usage/index.html)

```shell
brew install act
@@ -258,7 +260,7 @@ make githooks-run

### Deploy your local changes to AWS dev environment

A detailed description of our infrastructure is outlined [here](infrastructure/README.md).
A detailed description of our infrastructure is outlined in this [file](infrastructure/README.md).

We use Terraform workspaces to distinguish each developer.
So make sure you use your own unique combination of initials to set the workspace.
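A sketch of how that might look (the `vita-` prefix and the initials are illustrative assumptions, not necessarily the project's actual naming convention):

```shell
# Sketch: select a per-developer Terraform workspace keyed on your initials.
initials="${DEV_INITIALS:-abc}"     # replace with your own unique initials
workspace="vita-${initials}"
terraform workspace select "$workspace" 2>/dev/null \
  || terraform workspace new "$workspace" 2>/dev/null \
  || echo "would run: terraform workspace new $workspace"
```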
@@ -407,7 +409,7 @@ Our release strategy is based on Semantic Versioning and utilizes tagged commits
- A new tagged release in the "Releases" section of the GitHub repository.
- The corresponding build artifact within the `/tags` folder of the GitHub AWS S3 bucket.

**The branching and tagging strategy to fix broken deployed releases can be found [here](https://nhsd-confluence.digital.nhs.uk/spaces/Vacc/pages/989220238/Branching+and+release+strategy#Branchingandreleasestrategy-Fixingdeployedbrokenreleases).**
**The branching and tagging strategy to fix broken deployed releases can be found on this [page](https://nhsd-confluence.digital.nhs.uk/spaces/Vacc/pages/989220238/Branching+and+release+strategy#Branchingandreleasestrategy-Fixingdeployedbrokenreleases).**

## Design

5 changes: 2 additions & 3 deletions contract/README.md
@@ -7,16 +7,15 @@ In the MyVaccines codebase there are 2 files for contract testing with EliD:
### 1. eligibility-api.contract.ts

- Run against a real EliD environment, set by the environment variable "ELIGIBILITY_API_ENDPOINT" as follows:
- Github Actions: the value is defined in the workflow/action .yaml file, passed in as a env var to the contract-test action
- GitHub Actions: the value is defined in the workflow/action .yaml file, passed in as an env var to the contract-test action
- Local: ELIGIBILITY_API_ENDPOINT is set in env.local
- Assertions verify that in the EliD response:
- Success cases: For a given NHS number
- the eligibilityStatus has the expected value (hardcoded in the test assertion as an expectation)
- the eligibilityStatus has the expected value (hard-coded in the test assertion as an expectation)
- CohortElement is present
- Failure cases:
- that for each NHS number expected to fail, a 'LOADING_ERROR' response gets returned


### 2. fetch-eligibility-content.contract.ts

- **This does not call a real environment**
2 changes: 1 addition & 1 deletion docs/NHSLOGIN.md
@@ -35,5 +35,5 @@ After successful set up, you will receive an email where you will get access to

### Setup SSO Connection between fake client and VitA app

Reach out to NHS Login Support in Slack, and request the sso connection from Fake Client to VitA app and provide Client ID of each.
Reach out to NHS Login Support in Slack, and request the SSO connection from Fake Client to VitA app and provide Client ID of each.
This will have to be done for each SSO Client we create in the different environments.
@@ -54,7 +54,7 @@ Implementation of this compliance check (like text encoding, line endings, tabs

Other linting tools, for example [Prettier](https://prettier.io/) and [ESLint](https://eslint.org/), are not considered here as they are code formatting tools dedicated to specific technologies and languages. The main drivers for this decision are the style consistency across all files in the codebase and to eliminate any disruptive changes introduced based on preferences. EditorConfig rules are recognised and supported by most if not all major editors and IDEs.

Here is the recommended ruleset:
Here is the recommended rule set:

```console
charset = utf-8
@@ -1,11 +1,11 @@
# ADR-003: Acceptable use of GitHub authentication and authorisation mechanisms

>| | |
>| ------------ | --- |
>| Date | `04/09/2023` |
>| Status | `RFC` |
>| Deciders | `Engineering` |
>| Significance | `Construction techniques` |
>| | |
>| ------------ |----------------------------------------------|
>| Date | `04/09/2023` |
>| Status | `RFC` |
>| Deciders | `Engineering` |
>| Significance | `Construction techniques` |
>| Owners | `Amaan Ibn-Nasar, Jacob Gill, Dan Stefaniuk` |

---
12 changes: 6 additions & 6 deletions docs/adr/ADR-004_Content_caching_architecture.md
@@ -1,12 +1,12 @@
# ADR-004: Content caching architecture

>| | |
>| ------------ |------------------------------------------------------|
>| Date | `22/04/2025` |
>| Status | `Accepted` |
>| Deciders | `Engineering, Architecture` |
>| | |
>| ------------ |--------------------------------------------------------|
>| Date | `22/04/2025` |
>| Status | `Accepted` |
>| Deciders | `Engineering, Architecture` |
>| Significance | `Structure, Nonfunctional characteristics, Interfaces` |
>| Owners | Ankur Jain, Elena Oanea |
>| Owners | Ankur Jain, Elena Oanea |

---

2 changes: 1 addition & 1 deletion docs/adr/ADR-005_NHS_Login_OIDC_Flow_Library.md
@@ -34,7 +34,7 @@ for selecting one of these libraries.

## Decision

We have decided to adopt the next-auth library (also known as [authjs](https://authjs.dev/)). This library offers a clean
We have decided to adopt the next-auth library (also known as [Auth.js](https://authjs.dev/)). This library offers a clean
and concise approach to configuring authentication providers and provides flexibility for future use cases. Furthermore,
next-auth manages sessions and session lifecycle, eliminating the need for a separate session management library.
Notably, next-auth leverages the well-regarded [openid-client](https://github.com/panva/openid-client) library internally.
12 changes: 6 additions & 6 deletions docs/adr/ADR-XXX_Agree_CICD_pipeline_structure.md
@@ -1,12 +1,12 @@
# ADR-XXX: Agree CI/CD pipeline structure

>| | |
>| ------------ | --- |
>| Date | `dd/mm/YYYY` _when the decision was last updated_ |
>| Status | `RFC by dd/mm/YYYY, Proposed, In Discussion, Pending Approval, Withdrawn, Rejected, Accepted, Deprecated, ..., Superseded by ADR-XXX or Supersedes ADR-XXX` |
>| | |
>| ------------ |------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
>| Date | `dd/mm/YYYY` _when the decision was last updated_ |
>| Status | `RFC by dd/mm/YYYY, Proposed, In Discussion, Pending Approval, Withdrawn, Rejected, Accepted, Deprecated, ..., Superseded by ADR-XXX or Supersedes ADR-XXX` |
>| Deciders | `Tech Radar, Engineering, Architecture, Solution Assurance, Clinical Assurance, Technical Review and Governance, Information Governance, Cyber Security, Live Services Board,` ... |
>| Significance | `Structure, Nonfunctional characteristics, Dependencies, Interfaces, Construction techniques,` ... |
>| Owners | |
>| Significance | `Structure, Nonfunctional characteristics, Dependencies, Interfaces, Construction techniques,` ... |
>| Owners | |

---

4 changes: 2 additions & 2 deletions docs/developer-guides/Scripting_Docker.md
@@ -154,8 +154,8 @@ It is usually the case that there is a specific image that you will most often w

```make
build: # Build the project artefact @Pipeline
DOCKER_IMAGE=my-shiny-app
make docker-build
DOCKER_IMAGE=my-shiny-app
make docker-build
```

Now when you run `make build`, it will do the right thing. Keeping this convention consistent across projects means that new starters can be on-boarded quickly, without needing to learn a new set of conventions each time.
2 changes: 1 addition & 1 deletion docs/developer-guides/Scripting_Terraform.md
@@ -256,7 +256,7 @@ To create your `test` environment, you run the same commands with `test` where p
TF_ENV=test AWS_PROFILE=my-test-environment make terraform-apply opts="-auto-approve"
```

To use the same `terraform` files in a GitHub action, see the docs [here](https://docs.github.com/en/actions/deployment/security-hardening-your-deployments/configuring-openid-connect-in-amazon-web-services).
To use the same `terraform` files in a GitHub action, see the [docs](https://docs.github.com/en/actions/deployment/security-hardening-your-deployments/configuring-openid-connect-in-amazon-web-services).

### Your stack implementation

14 changes: 7 additions & 7 deletions infrastructure/README.md
@@ -19,9 +19,9 @@ Secrets need to be created in AWS Secrets Manager as follows:

Update the values for the following secrets after generating them:

- /vita/apim/prod-1.pem - APIM private key used to sign JWTs to access user-restricted APIs via APIM, generated from [here](https://digital.nhs.uk/developer/guides-and-documentation/security-and-authorisation/user-restricted-restful-apis-nhs-login-separate-authentication-and-authorisation#step-3-generate-a-key-pair). 'prod-1' here is the key id used during generation.
- /vita/apim/prod-1.pem - APIM private key used to sign JWTs to access user-restricted APIs via APIM, following this [guide](https://digital.nhs.uk/developer/guides-and-documentation/security-and-authorisation/user-restricted-restful-apis-nhs-login-separate-authentication-and-authorisation#step-3-generate-a-key-pair). 'prod-1' here is the key id used during generation.
- /vita/apim/prod-1.json - APIM public key in JWKS format generated above
- /vita/nhslogin/private_key.pem - NHS Login private key generated from [here](https://nhsconnect.github.io/nhslogin/generating-pem/)
- /vita/nhslogin/private_key.pem - NHS Login private key generated from this [guide](https://nhsconnect.github.io/nhslogin/generating-pem/)
- /vita/nhslogin/public_key.pem - NHS Login public key generated above
- /vita/splunk/hec/endpoint - HEC endpoint of Splunk
- /vita/splunk/hec/token - HEC token of Splunk endpoint to store operational logs
@@ -50,11 +50,11 @@ Manually create the following error routes. The current library we use doesn't u
- HTTP Response code: 500
- Repeat the previous step for all other 5xx codes.
- Click "Create custom error response" button
- HTTP error code: 403
- Error caching minimum TTL: 300
- Customise error response: yes
- Response page path: /assets/static/service-failure.html
- HTTP Response code: 403
- HTTP error code: 403
- Error caching minimum TTL: 300
- Customise error response: yes
- Response page path: /assets/static/service-failure.html
- HTTP Response code: 403
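For reference, the same two error routes could be expressed in Terraform if the distribution were managed there (a hedged sketch only; the attribute values mirror the console steps above):

```hcl
# Inside the aws_cloudfront_distribution resource (sketch, not applied today):
custom_error_response {
  error_code            = 500
  response_code         = 500
  response_page_path    = "/assets/static/service-failure.html"
  error_caching_min_ttl = 300
}

custom_error_response {
  error_code            = 403
  response_code         = 403
  response_page_path    = "/assets/static/service-failure.html"
  error_caching_min_ttl = 300
}
```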

### Setting default limits and settings

50 changes: 25 additions & 25 deletions infrastructure/modules/deploy_splunk/files/lambda.py
@@ -8,26 +8,26 @@
Cloudwatch Logs sends to Firehose records that look like this:

{
"messageType": "DATA_MESSAGE",
"owner": "123456789012",
"logGroup": "log_group_name",
"logStream": "log_stream_name",
"subscriptionFilters": [
"messageType": "DATA_MESSAGE",
"owner": "123456789012",
"logGroup": "log_group_name",
"logStream": "log_stream_name",
"subscriptionFilters": [
"subscription_filter_name"
],
"logEvents": [
],
"logEvents": [
{
"id": "01234567890123456789012345678901234567890123456789012345",
"timestamp": 1510109208016,
"message": "log message 1"
"id": "01234567890123456789012345678901234567890123456789012345",
"timestamp": 1510109208016,
"message": "log message 1"
},
{
"id": "01234567890123456789012345678901234567890123456789012345",
"timestamp": 1510109208017,
"message": "log message 2"
"id": "01234567890123456789012345678901234567890123456789012345",
"timestamp": 1510109208017,
"message": "log message 2"
}
...
]
]
}

The data is additionally compressed with GZIP.
@@ -37,22 +37,22 @@
1) Gunzip the data
2) Parse the json
3) Set the result to ProcessingFailed for any record whose messageType is not DATA_MESSAGE, thus redirecting them to the
processing error output. Such records do not contain any log events. You can modify the code to set the result to
Dropped instead to get rid of these records completely.
processing error output. Such records do not contain any log events. You can modify the code to set the result to
Dropped instead to get rid of these records completely.
4) For records whose messageType is DATA_MESSAGE, extract the individual log events from the logEvents field, and pass
each one to the transformLogEvent method. You can modify the transformLogEvent method to perform custom
transformations on the log events.
each one to the transformLogEvent method. You can modify the transformLogEvent method to perform custom
transformations on the log events.
5) Concatenate the result from (4) together and set the result as the data of the record returned to Firehose. Note that
this step will not add any delimiters. Delimiters should be appended by the logic within the transformLogEvent
method.
this step will not add any delimiters. Delimiters should be appended by the logic within the transformLogEvent
method.
6) Any individual record exceeding 6,000,000 bytes in size after decompression, processing and base64-encoding is marked
as Dropped, and the original record is split into two and re-ingested back into Firehose or Kinesis. The re-ingested
records should be about half the size compared to the original, and should fit within the size limit the second time
round.
as Dropped, and the original record is split into two and re-ingested back into Firehose or Kinesis. The re-ingested
records should be about half the size compared to the original, and should fit within the size limit the second time
round.
7) When the total data size (i.e. the sum over multiple records) after decompression, processing and base64-encoding
exceeds 6,000,000 bytes, any additional records are re-ingested back into Firehose or Kinesis.
exceeds 6,000,000 bytes, any additional records are re-ingested back into Firehose or Kinesis.
8) The retry count for intermittent failures during re-ingestion is set to 20 attempts. If you wish to retry fewer number
of times for intermittent failures you can lower this value.
of times for intermittent failures you can lower this value.
"""

import base64