
Commit dd67dd3

Merge pull request #8 from maytaldahan/main
Added section headers and fixed minor navigation consistencies
2 parents 8d76a53 + 4a8ece9

10 files changed

Lines changed: 58 additions & 53 deletions


tutorials/Fine_Tuning_Vision/01-intro-fine-tuning.md

Lines changed: 6 additions & 6 deletions
@@ -1,4 +1,4 @@
-## Fine-Tuning Vision Models Using Ultralytics and Tapis
+## Section 8: Fine-Tuning Vision Models Using Ultralytics and Tapis
 
 This Tapis application allows users to fine-tune Vision models such as YOLO using Ultralytics running in a Singularity container on Vista. It is designed to run on High-Performance Computing (HPC) systems via Tapis, leveraging GPU acceleration for training tasks.
 
@@ -93,12 +93,12 @@ These parameters define the hardware footprint requested from the Slurm schedule
 
 Follow these steps to submit the job using the Tapis UI:
 
-### Step 1: Initiate Submission
+### Step 8.1: Initiate Submission
 Navigate to the **Apps** list and select the `ultralytics-fine-tune` app. Click the button to initiate a JSON-based submission.
 
 ![Step 1 - Selecting the Ultralytics App in UI](/tutorials/images/Step1-Ultralyticsapp.png)
 
-### Step 2: Edit the JSON Payload
+### Step 8.2: Edit the JSON Payload
 Paste the job JSON provided below into the editor and click `Submit`.
 
 <div style="max-height:400px; overflow:auto; border:1px solid #ddd; padding:10px;">
@@ -148,21 +148,21 @@ These are specific flags passed to the Slurm scheduler on the Vista system:
 
 ![Step 2 - Pasting the Job JSON into the Editor and Submit](/tutorials/images/Fine-tune-job.png)
 
-### Step 3: Monitor Job Progress
+### Step 8.3: Monitor Job Progress
 After clicking **Submit**, navigate to the **Jobs** tab. You can monitor the status as it moves from `PENDING` to `RUNNING` and finally `FINISHED`.
 
 ![Step 3 - Monitoring Job Status in the Dashboard](/tutorials/images/fine-tune-job-running.png)
 Since we are running 100 Epochs to fine-tune, this will take around 10-12 minutes to finish.
 
 
-### Step 4: Job Output and Results location
+### Step 8.4: Job Output and Results location
 Once the job finishes, you should see output similar to the image below.
 ![Step 4 - After Job completes](/tutorials/images/fine-tune-job-completion.png)
 
 You should see a train directory with weights. Inside the `weights` directory, you can access the `best.pt model`. This file will be accessible in Jupyter Notebook.
 
 
-### Step 5: Finding best.pt file from Jupyter
+### Step 8.5: Finding best.pt file from Jupyter
 
 Inside your work directory, you should see a `vista` folder. In there you can find the directory named with your `jobUUID`. In the above image the job uuid is highlighted. In the job directory you will find the outputs of the job archived with the same train directory containing the best.pt file.
 `$WORK -> vista -> jobUUID -> train -> weights -> best.pt`
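
The steps above drive the submission through TapisUI; the same submit-and-poll flow can also be scripted. A minimal sketch with the tapipy client, assuming tapipy is installed and using the `ultralytics-fine-tune` app named above (the app version and credentials are placeholders, not taken from this commit):

```python
# Sketch only: submit the fine-tune job and poll its status with tapipy.
# appId/appVersion/credentials are placeholders; adjust to your tenant.
import time
from tapipy.tapis import Tapis

t = Tapis(base_url="https://public.tapis.io",
          username="YOUR_USERNAME", password="YOUR_PASSWORD")
t.get_tokens()  # fetch a JWT for subsequent calls

job = t.jobs.submitJob(name="ultralytics-fine-tune-run",
                       appId="ultralytics-fine-tune",
                       appVersion="1.0")  # version is an assumption

# Mirror the UI's PENDING -> RUNNING -> FINISHED progression.
status = t.jobs.getJobStatus(jobUuid=job.uuid).status
while status not in ("FINISHED", "FAILED", "CANCELLED"):
    time.sleep(60)  # ~100 epochs takes 10-12 minutes per the tutorial
    status = t.jobs.getJobStatus(jobUuid=job.uuid).status
print(job.uuid, status)
```
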

tutorials/Fine_Tuning_Vision/02-evaluation.md

Lines changed: 10 additions & 2 deletions
@@ -1,14 +1,22 @@
-## Evaluating the Fine-tuned Model
+## Section 9: Evaluating the Fine-tuned Model
 
 To try this part of the tutorial, we will use the fine-tuned model `best.pt` created by the ultralytics-fine-tune job.
 
-Your fine-tuned model will be accessible through Jupyter in this location
+### Step 9.1: Locate your Fine-tuned model
+
+Once your job is complete, your fine-tuned model will be accessible through Jupyter in this location
 `$WORK -> vista -> jobUUID -> train -> weights -> best.pt`
 
+Please locate your file.
+
+### Step 9.2: Point your Jupyter notebook code at your Fine-tuned model location
+
 You should already have the generated code that we tested accuracy on the baseline model.
 
 Now, just change the path of your model to the fine-tuned model path and re-run the code to test the accuracy.
 
+### Step 9.3: Re-run the notebook
+
 When the code runs on the test data, you should see improved accuracy
 
 ```
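
Step 9.2's path swap amounts to a one-line change in the notebook. A sketch of the evaluation cell using the Ultralytics API, assuming the generated code loads the model with `YOLO(...)` and that the dataset ships a `data.yaml` (both are assumptions; the Jupyter mount point may also differ from the `$WORK -> vista` layout shown above):

```python
# Sketch: point the existing evaluation at the fine-tuned weights.
from ultralytics import YOLO

# Fine-tuned weights from the job output (replace <jobUUID> with yours).
model = YOLO("/home/jovyan/work/vista/<jobUUID>/train/weights/best.pt")

# The data.yaml path is an assumption about the dataset layout.
metrics = model.val(data="AnimalEcology.v4i.yolov11/data.yaml", split="test")
print(f"mAP50: {metrics.box.map50:.3f}")  # expect this to beat the baseline
```
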

tutorials/Intro_Tapis/01-intro-to-tapis.md

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-## AI, The National CI Ecosystem, and the Tapis API Platform
+## Section 1: AI, The National CI Ecosystem, and the Tapis API Platform
 
 [Lecture Slides](https://docs.google.com/presentation/d/1BVLnUbyiWjsaS33zMshW3TXqtfvv6zGklaCNBeX7Go0/edit?slide=id.g3cd6a51b6a2_0_21#slide=id.g3cd6a51b6a2_0_21)

tutorials/Intro_Tapis/02-initial-tapis-ui.md

Lines changed: 10 additions & 10 deletions
@@ -6,9 +6,9 @@ img {
 </style>
 
 
-# Initial Steps with TapisUI
+# Section 2: Initial Steps with TapisUI
 
-## Logging In
+## Step 2.1: Logging In
 
 Login to [https://public.tapis.io](https://public.tapis.io). Visit the site and press the "Proceed to login" area.
 
@@ -25,7 +25,7 @@ You will then be prompted to log-in with username, password, and MFA token.
 <img src="/tutorials/assets/demo/login-sso.png" style="max-width:50%;">
 
 
-### Inspect JWT
+### Step 2.2: Inspect JWT
 
 Once logged in, you can inspect your JSON Web Token (JWT). The JWT is the authentication token that Tapis uses to verify your identity across all API calls. It contains encoded claims about your user session — including your username, tenant, and token expiration. You can view your token in TapisUI to understand how Tapis manages identity and access.
 
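As an aside to the Inspect JWT step above: the token can also be decoded outside TapisUI. A sketch with PyJWT, skipping signature verification since the goal is only to read claims (the claim names shown are typical of Tapis v3 tokens, not taken from this commit):

```python
# Sketch: decode the JWT copied from TapisUI to read its claims.
# Signature verification is skipped; we only inspect, not validate.
import jwt  # PyJWT

token = "<paste your JWT from TapisUI>"
claims = jwt.decode(token, options={"verify_signature": False})

# Typical Tapis claims: username, tenant, and expiration time.
print(claims.get("tapis/username"),
      claims.get("tapis/tenant_id"),
      claims.get("exp"))
```
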
@@ -37,7 +37,7 @@ A Tapis **system** represents a storage or execution resource — such as a VM o
 
 In this tutorial, instead of creating systems from scratch, we will use a **public system** that has been pre-registered for you on TACC's Vista cluster.
 
-### Navigate the Systems List
+### Step 2.3: Navigate the Systems List
 
 When you log in to TapisUI and click on **Systems** from the left-hand menu, you should see one public system available to you. This system has been pre-registered for the tutorial.
 
@@ -46,7 +46,7 @@ When you log in to TapisUI and click on **Systems** from the left-hand menu, you
 {: .note}
 > ⚠️ Note: The system is visible but you are **not yet authenticated** to access files or run jobs on it. You must first add TMS credentials.
 
-### Add TMS Credentials for the NAIRR Vista Public System
+### Step 2.4: Add TMS Credentials for the NAIRR Vista Public System
 
 To access the public system running on Vista, you need to add **TMS (Trust Management System)** credentials. TMS credentials are temporary credentials generated by the TMS service and stored in the Tapis Security Kernel (SK). They allow Tapis services and applications to securely access external resources on your behalf. Instead of storing permanent usernames or passwords, Tapis retrieves the required credentials from TMS at runtime — improving security by keeping sensitive information encrypted and centrally managed.
 
@@ -56,7 +56,7 @@ Click on **Authenticate with TMS Keys** on the Vista system.
 
 Once your TMS credentials are added, your system is ready to use.
 
-### Use "Go to Home" to Get to Files
+### Step 2.5: Use "Go to Home" to Get to Files
 
 After authenticating, you can verify your system access by clicking the **View Files** button, which will take you to the file listing on the Vista system.
 
@@ -68,13 +68,13 @@ After authenticating, you can verify your system access by clicking the **View F
 
 The Tapis Files service provides a unified interface for managing files across any registered Tapis system. Once you have authenticated to a system, you can browse, upload, download, and manage files directly from TapisUI.
 
-### Navigate the Files in Vista System
+### Step 2.6: Navigate the Files in Vista System
 
 From the left-hand menu, click on the **Files** tab and select the Vista system. You should see a listing of files on the system. You can navigate directories, view file details, and open files directly.
 
 ![System View Files](/tutorials/images/System_View_Files.png)
 
-### Test Uploading a File
+### Step 2.7: Test Uploading a File
 
 Try uploading a file to the Vista system using the upload functionality in the Files tab. This confirms that your credentials are working and that you have write access to the system.
 
@@ -86,11 +86,11 @@ A Tapis **application** represents all the information required to run a Tapis j
 
 In this tutorial, instead of creating an application from scratch, we will use the **FlexServ** public application that has already been registered for you.
 
-### Navigate to the Tapis Apps List
+### Step 2.8: Navigate to the Tapis Apps List
 
 Click on **Apps** from the left-hand menu in TapisUI. You will see a list of available applications. Toggle between "My Apps" and "Public Apps" to see the full list.
 
-### Find the FlexServ Public App
+### Step 2.9: Find the FlexServ Public App
 
 Look for the application **FlexServ-vista-nairr version 1.4.0** in the public apps list. FlexServ is a TACC-owned inference server for running AI models on HPC systems. It has been pre-registered as a public app for all users to submit jobs.
 
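The Systems and Files walkthrough above is UI-driven; for reference, the same listing and upload can be done programmatically. A sketch with tapipy (the system id is illustrative, since the public Vista system's actual id is not shown in this diff):

```python
# Sketch: list and upload files on the pre-registered Vista system.
# SYSTEM_ID is a placeholder; use the id shown in the Systems tab.
from tapipy.tapis import Tapis

t = Tapis(base_url="https://public.tapis.io",
          username="YOUR_USERNAME", password="YOUR_PASSWORD")
t.get_tokens()

SYSTEM_ID = "vista-public-system"  # placeholder

# Mirror Step 2.6: browse the file listing.
for f in t.files.listFiles(systemId=SYSTEM_ID, path="/"):
    print(f.name, f.size)

# Mirror Step 2.7: upload a small test file.
t.upload(system_id=SYSTEM_ID, source_file_path="hello.txt",
         dest_file_path="/hello.txt")
```
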
tutorials/Jupyter-Notebook/intro-to-jupyter.md

Lines changed: 3 additions & 11 deletions
@@ -1,21 +1,13 @@
-Introduction to the Jupyter Notebook Environment
-===================================================
-
+## Section 6: Launching the Jupyter Notebook Environment
 
 Jupyter is an open source project that provides a webapp interface for writing code and documents. Throughout this tutorial, we will be using a Jupyter Notebook environment for making Tapis User Requests.
 
-### Download Notebook
-
-Dowload the Jupyter Notebook by clicking the link below:
-[Download Notebook](sentiment_analysis.ipynb)
-
-### Starting up your Jupyter Notebook Environment
+### Step 6.1: Starting up your Jupyter Notebook Environment
 
 For this tutorial, we will use [TACC's Public JupyterHub](https://public.jupyter.tacc.cloud)
 You may login with your TACC accounts.
 
-
-### Navigating to the $WORK File System
+### Step 6.2: Navigating to the $WORK File System
 
 On successful login, ensure that you have access to a folder, `work`, within the Jupyter file system.

tutorials/Tapis_FlexServ/01-lecture.md

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-## Classes of AI Models, Commercial Registries and Flexserv
+## Section 3: Classes of AI Models, Commercial Registries and Flexserv
 
 [Lecture Slides](https://docs.google.com/presentation/d/1BVLnUbyiWjsaS33zMshW3TXqtfvv6zGklaCNBeX7Go0/edit?slide=id.g3cd6a51b6a2_0_26#slide=id.g3cd6a51b6a2_0_26)

tutorials/Tapis_FlexServ/01b-running-flexserv.md

Lines changed: 8 additions & 7 deletions
@@ -1,4 +1,6 @@
-## Adding TMS Credentials on the Vista system.
+## Section 4: Executing Large Models on Vista with Tapis and Flexserv
+
+### Step 4.1: Adding TMS Credentials on the Vista system
 
 To access the public system running on Vista, first you will need to add TMS credentials on the system. TMS (Trust Management System) credentials on Tapis systems are temporary credentials generated by the TMS System and stored in the Tapis Security Kernel (SK) that allow services or applications to securely access external resources on behalf of a user. Instead of storing permanent usernames or passwords, Tapis retrieves the required credentials from the TMS service at runtime. This approach improves security by keeping sensitive information encrypted and centrally managed while enabling automated job execution on Tapis systems.
 
@@ -14,7 +16,7 @@ You can now view files on Vista, by clicking on the `View Files ` button.
 ![View Files](/tutorials/images/ViewFiles.png)
 
 
-## Running FlexServ Application on Vista
+### Step 4.2: Running FlexServ Application on Vista
 
 The following app runs the FlexServ on TACC's Vista System. For the purposes of this tutorial, the application has already been registered with Tapis and is available as a public app for all users to submit jobs.
 
@@ -242,7 +244,7 @@ The following app runs the FlexServ on TACC's Vista System. For the purposes of
 You should see the Flex Server application already registered in your Tapis UI: **FlexServ-vista-nairr version 1.4.0**
 ![FlexServ Application](/tutorials/images/Flexserv_app.png)
 
-## Submit FlexServ Job using TAPIS UI
+### Step 4.3: Submit FlexServ Job using TAPIS UI
 
 **1. Initiate Submission**
 
@@ -370,13 +372,12 @@ Step 5b) Once the `tapisjob.out` opens, look at the ACCESS INFORMATION Section t
 ![Step 5b: Vista_url_token](/tutorials/images/GetFlexServerPortToken.png)
 
 
-### What's next?
+### Step 4.4: Using the FlexServ UI
 
 If you made it this far, you are successfully running FlexServ, and you can explore the FlexServ UI next and try to send your first chat.
 
-### Using the FlexServ UI
-Go to the URL hhtps://vista.tacc.utexas.edu:`Port number from above` and enter the TAP token from the tapisjob.out as shown in figure below.
+Go to the URL `https://vista.tacc.utexas.edu`:`Port number from above` and enter the TAP token from the tapisjob.out as shown in the figure below.
 
 ![FlexServ UI](/tutorials/images/FlexservUI.png)
 
-Next, we will use the flexserv to generate a code for small animal detection using vision models and evaluate the model performance.
+Next, we will use the flexserv to generate code for small animal detection using vision models and evaluate the model performance.
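
The new Step 4.4 sends the first chat through the browser. If the running FlexServ instance exposes an OpenAI-compatible endpoint (an assumption here; this commit does not state it), the same chat could be sent from Python along these lines:

```python
# Sketch only. ASSUMPTION: FlexServ serves an OpenAI-compatible API under
# /v1 and accepts the TAP token as the API key; verify before relying on it.
from openai import OpenAI

PORT = "12345"      # port number from tapisjob.out (placeholder)
TAP_TOKEN = "..."   # TAP token from tapisjob.out

client = OpenAI(base_url=f"https://vista.tacc.utexas.edu:{PORT}/v1",
                api_key=TAP_TOKEN)

resp = client.chat.completions.create(
    model="Qwen/Qwen2.5-Coder-32B-Instruct",  # a model from the pool
    messages=[{"role": "user", "content": "Hello from the tutorial!"}],
)
print(resp.choices[0].message.content)
```
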

tutorials/Tapis_FlexServ/01c-code-gen-flexserv.md

Lines changed: 13 additions & 7 deletions
@@ -1,20 +1,23 @@
-## Prompt Engineering and Generating Image Detection Code
+## Section 7: Prompt Engineering and Generating Image Detection Code
 
 [Lecture Slides](https://docs.google.com/presentation/d/1BVLnUbyiWjsaS33zMshW3TXqtfvv6zGklaCNBeX7Go0/edit?slide=id.g3cdba15a02d_6_191#slide=id.g3cdba15a02d_6_191)
 
-
 ### FlexServ API Prompt: YOLO Evaluation Script Generator
 
 #### Task Summary:
 To test the capabilities of the FlexServ inference server, we can provide a complex prompt to the Responses API. This prompt asks the AI to generate a complete Python evaluation script that performs Animal detection on the images from the LILA BC Small Animal dataset. This is a large camera-trap image dataset used for wildlife monitoring and ecological research. It contains millions of images captured by automated cameras, including small mammals and many blank triggers, along with annotations describing the detected species. For training object detection models such as YOLO, the dataset can be downloaded in YOLO format, where each image has a corresponding .txt label file containing bounding-box coordinates in the form <class_id> <x_center> <y_center> <width> <height>.
 
 ---
 
-#### On FlexServ UI
+### Exploring the FlexServ UI
+
+### Step 7.1: Refresh Model Pool
 
 - Refresh the Model pool so you can see public and private models available for you to run.
 ![Model Pool](/tutorials/images/Model_pool.png)
 
+### Step 7.2: Update the Responses API and Parameters
+
 - Copy and paste the following prompt into the FlexServ UI in the `Responses API`, `Input(Markdown)` section, shown in the image below.
 
 
@@ -72,21 +75,24 @@ After the code, briefly explain how the program works in plain English.
 - `Qwen/Qwen2.5-Coder32B-Instruct-61.0 GB - Text Generation`
 - Make sure the `Streams` is checked.
 - Uncheck `Multi-turn conversation`
+
+### Step 7.3: Run the Responses API
+
 - Click `Run`. Within a few minutes, you should see the code generation start in the blue box in the Responses API. Wait for it to complete. After completion, you should see output similar to the image below.
 
 ![Code](/tutorials/images/Code.png)
 
 
-### On Jupyter:
+### Step 7.4: Running Code Detection On Jupyter
 
-Go to the notebook Code-Detection on your Jupyter path.
+Go to the Jupyter notebook Code-Detection on your Jupyter path.
 `ai-tutorial-2026 -> notebooks -> Code-Detection.ipynb`
 
 Copy the generated code from FlexServ UI in a new cell below the cell titled `Put your generated code here`.
 
-Update the variable `DATASET_ROOT` to path `/home/jovyan/work/vista/ai-tutorial-2026/datasets/AnimalEcology.v4i.yolov11` in your generated code
+Update the variable `DATASET_ROOT` to path `/home/jovyan/ai-tutorial-2026/datasets/AnimalEcology.v4i.yolov11` in your generated code
 
-Also update the model path to `/home/jovyan/work/vista/ai-tutorial-2026/models/yolov9t_ep200_bs32_lr0.005_baa22147.pt`
+Also update the model path to `/home/jovyan/ai-tutorial-2026/models/yolov9t_ep200_bs32_lr0.005_baa22147.pt`
 
 Now run the code. On successful run, you should see output similar to below
 
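The Step 7.4 edits above boil down to two variables in the generated code, and the YOLO label format quoted in the Task Summary can be sanity-checked directly. A sketch combining both (the paths come from the + lines of this diff; the train/labels layout is an assumption about the dataset, and the label file is discovered at run time):

```python
# Sketch: apply the tutorial's two path edits, then peek at one YOLO label
# ("<class_id> <x_center> <y_center> <width> <height>", normalized to [0, 1]).
from pathlib import Path
from ultralytics import YOLO

DATASET_ROOT = "/home/jovyan/ai-tutorial-2026/datasets/AnimalEcology.v4i.yolov11"
MODEL_PATH = "/home/jovyan/ai-tutorial-2026/models/yolov9t_ep200_bs32_lr0.005_baa22147.pt"

model = YOLO(MODEL_PATH)  # baseline weights named in the diff

# Split/labels layout is assumed; adjust the glob if the dataset differs.
label_file = next(Path(DATASET_ROOT).glob("*/labels/*.txt"), None)
if label_file:
    for line in label_file.read_text().splitlines():
        cls, xc, yc, w, h = line.split()
        print(f"class {cls}: center=({xc}, {yc}) size=({w} x {h})")
```
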
tutorials/Vision_Models/01-intro-vision-models.md

Lines changed: 1 addition & 3 deletions

@@ -1,7 +1,5 @@
-## Computer Vision Models in Scientific Applications
+## Section 5: Computer Vision Models in Scientific Applications
 
 [Lecture Slides](https://docs.google.com/presentation/d/1BVLnUbyiWjsaS33zMshW3TXqtfvv6zGklaCNBeX7Go0/edit?slide=id.g3cd6a51b6a2_0_31#slide=id.g3cd6a51b6a2_0_31)
 
 
-### Summary of Topics
-*

tutorials/_config.yml

Lines changed: 5 additions & 5 deletions
@@ -5,14 +5,14 @@ permalink: pretty
 baseurl: "/tutorials"
 nav:
 - AI/ML, The National CI Ecosysystem, and Tapis:
-  - AI, CI and Tapis: ./Intro_Tapis/01-intro-to-tapis
-  - Initial Steps with TapisUI: ./Intro_Tapis/02-initial-tapis-ui
+  - (Lecture) AI, CI and Tapis: ./Intro_Tapis/01-intro-to-tapis
+  - (Hands-on) Initial Steps with TapisUI: ./Intro_Tapis/02-initial-tapis-ui
 - AI Models and Fleserv:
-  - Classes of AI Models and the AI Ecosysystem: ./Tapis_FlexServ/01-lecture
-  - Hands-on Flexserv TapisUI Tutorial: ./Tapis_FlexServ/01b-running-flexserv
+  - (Lecture) Classes of AI Models, 3rd Party Model Registries, Flexserv: ./Tapis_FlexServ/01-lecture
+  - (Hands-on) Executing Large Models on Vista Via Tapis and Flexserv: ./Tapis_FlexServ/01b-running-flexserv
 - Vision Models, Fine-tuning and Jupyter Notebooks:
   - Vision Models in Science: ./Vision_Models/01-intro-vision-models
-  - Intro to Jupyter: ./Jupyter-Notebook/intro-to-jupyter
+  - (Hands-on) Launching Jupyter Notebook: ./Jupyter-Notebook/intro-to-jupyter
   - Prompt Engineering and Detection Code: ./Tapis_FlexServ/01c-code-gen-flexserv
   - Fine-tuning with Ultralytics: ./Fine_Tuning_Vision/01-intro-fine-tuning
 - Evaluating Fine-tuned Models:
