---
title: Call for Participation
---
| Date | Event |
|-|-|
| January 9th, 2026 – February 27th, 2026 | Rolling Abstract Period |
| March 21st, 2026 | Full Paper Deadline |
| March 27th, 2026 | Full Paper Notification |
| April 2nd, 2026 | Camera-ready |
| July 6th, 2026 | On-site Competition Event at FSE/AIWare 2026 |
## Brief
It is now common for developers to adopt libraries that aid their software development. Libraries provide pre‑tested, high‑quality functionality that can be easily integrated.

These libraries frequently depend on other libraries, creating a complex web of interdependencies—a hallmark of the software ecosystem.
The primary problems associated with libraries are bugs and, more importantly, security vulnerabilities. Studies have shown that outdated libraries expose projects to significant security risks, giving rise to initiatives such as the Global Software Supply Chain (GSSC) to protect libraries from supply‑chain attacks.
With the rise of generative AI in the developer workflow, tools like ChatGPT, Gemini and others are increasingly used to assist with daily tasks. However, using pre‑trained models presents challenges related to contextual understanding. Even agentic models often produce numerous errors during generation, potentially incurring significant time and monetary costs for developers.
Bringing together the Mining Software Repositories, Program Comprehension, and AIware software engineering communities, we invite students, researchers, and industry practitioners to apply their knowledge of new GenAI techniques to generating software libraries.
## Competition Format
### Task
The competition challenges participants to design and present different GenAI techniques capable of generating software libraries. The scientific contribution is to evaluate how current AI models and workflows perform when asked to reproduce or adapt the behavior, structure, or interface of an existing library.
In detail, participants are asked to complete the following task:
> *Generate a library software repository or library that is functionally similar to, or better than, the existing human-written repository.*
Unlike benchmarks, and from a software engineering perspective, teams should demonstrate reproducibility and the practical application of their techniques in a live demonstration. Novelty must be emphasized in the submission abstract, and all scripts and methods will be made available.
Participants may use any GenAI or strategies they prefer—fine-tuning, prompting strategies, program synthesis, RAG pipelines, multi-agent architectures, or any hybrid approach. The competition encourages creativity and experimentation.
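As one illustration of the range of strategies allowed, the sketch below outlines a minimal prompting-based pipeline. The `call_model` function is a hypothetical stand-in for whatever GenAI backend a team chooses (a fine-tuned model, a RAG pipeline, a multi-agent system), and the prompt format is only an assumption, not a prescribed interface.

```python
from dataclasses import dataclass

@dataclass
class GenerationTask:
    """Describes the target library to regenerate."""
    name: str         # e.g. a package name from the provided listing
    language: str     # target programming language
    api_summary: str  # short description of the public interface to reproduce

def build_prompt(task: GenerationTask) -> str:
    """Assemble a simple zero-shot prompt; teams would refine this heavily."""
    return (
        f"Write a {task.language} library named '{task.name}'.\n"
        f"It must expose the following interface:\n{task.api_summary}\n"
        "Return only source code."
    )

def call_model(prompt: str) -> str:
    """Hypothetical stand-in for a real GenAI backend.
    Replace with an actual model client or agent loop."""
    return "# generated source code would appear here"

def generate_library(task: GenerationTask) -> str:
    """End-to-end: task description in, candidate library source out."""
    return call_model(build_prompt(task))

task = GenerationTask(name="tinyjson", language="Python",
                      api_summary="loads(s) -> object, dumps(obj) -> str")
source = generate_library(task)
```

In a hybrid approach, `call_model` could itself orchestrate several agents (generator, tester, repairer) before returning a final candidate.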
Participants will develop demos for the end-to-end generation, testing, and maintenance of these AI-generated libraries, and will quantitatively compare their performance and quality against human-written counterparts.
### Dataset
We will provide a listing of all package libraries from seven ecosystems (Crates, Go, Maven, NPM, PHP, PyPI, RubyGems). Participants are also encouraged to bring their own domain-specific data. Participants may extend or refine the evaluation.
| Ecosystem | Packages | Source |
|-|-|-|
| … | … | … |
| RubyGems | 188,204 | https://rubygems.org/names |
| **Total** | 8,112,179 | -- |
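The listings above are plain name indexes (the RubyGems source, for instance, serves one package name per line). As a sketch of how a team might load such a listing and reproducibly sample generation targets, assuming that one-name-per-line format; in practice the text would be fetched from the listed sources rather than inlined:

```python
import random

def parse_name_listing(text: str) -> list[str]:
    """Parse a one-name-per-line package index into a clean list of names."""
    return [line.strip() for line in text.splitlines() if line.strip()]

def sample_targets(names: list[str], k: int, seed: int = 0) -> list[str]:
    """Reproducibly pick k candidate libraries to regenerate."""
    rng = random.Random(seed)  # fixed seed keeps the sample reproducible
    return rng.sample(names, k)

# A tiny inline sample keeps this self-contained; real runs would read the
# full listing downloaded from an ecosystem index such as rubygems.org/names.
listing = "rails\nrake\nnokogiri\nrspec\n"
names = parse_name_listing(listing)
targets = sample_targets(names, 2)
```

Fixing the sampling seed matters for the live demonstration, where results must be reproduced in front of the judges.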
### Abstract and Video Submission Requirements
Each team must submit a short abstract and a mandatory video (up to 5 minutes) demonstrating their technique. The video must show the novelty of the approach, the key techniques used, and the end-to-end generation. Optional links to repositories or demos are welcome.
### On-site Event
Teams are encouraged to use their own resources (local or cloud), but for inclus…
### Judging and Awards
Judges from the program committee will evaluate submissions based on novelty, feasibility of execution, similarity to the human-written library, and presentation. The assessment will consider both the submitted materials and the on-site performance. A combination of committee judgement and on-site voting may be used to recognise excellence. The best contributions will receive a certificate from the competition chairs.
This competition aims to address challenges arising from software ecosystem dependencies by introducing a novel approach: On-Demand Library Generation (ODAI-LIB).
Contestants will focus on dynamically generating high-quality libraries precisely when they are needed within a project. The participants will leverage an initial dataset of software libraries. Rather than benchmarking, participants will demonstrate practical usage in a real-world scenario using the real data provided.