diff --git a/Readme.md b/Readme.md index 78bfa52e..45f03ee5 100644 --- a/Readme.md +++ b/Readme.md @@ -1,99 +1,184 @@ -# Node.js Fundamentals +# Assignment: Node.js Fundamentals ## Description -This repository contains solutions for Node.js Fundamentals assignment. The assignment covers various Node.js core APIs including File System, CLI, Modules, Hash, Streams, Zlib, Worker Threads, and Child Processes. +Your task is to complete several tasks to learn Node.js core APIs. Each subtask is a standalone exercise in a dedicated file inside the corresponding subfolder of `src/`. -## Getting Started +Fork the starter repository and implement the required functionality. -1. **Fork this repository** - - Click the "Fork" button at the top right of this page: https://github.com/AlreadyBored/node-nodejs-fundamentals +Starter repository: https://github.com/AlreadyBored/node-nodejs-fundamentals -2. **Clone your fork** - - ```bash - git clone https://github.com/YOUR_USERNAME/node-nodejs-fundamentals.git - cd node-nodejs-fundamentals - ``` +## Technical requirements -3. **Install dependencies** (if any added in the future) - - ```bash - npm install - ``` +- Any external tools and libraries are prohibited +- Use 24.x.x version (24.10.0 or upper) of Node.js +- Don't change the signature of pre-written functions (e.g. don't rename them, don't make them synchronous, etc.) +- Prefer asynchronous API whenever possible -4. **Start implementing the tasks** - - Each file in the `src/` directory contains a function template with comments describing what needs to be implemented. 
- -## Requirements - -- Node.js version: >=24.10.0 -- npm version: >=10.9.2 -- No external libraries allowed - -## Usage +## Subtasks ### File System (src/fs) -- `npm run fs:snapshot` - Create snapshot of workspace directory -- `npm run fs:restore` - Restore directory structure from snapshot -- `npm run fs:findByExt` - Find files by extension in workspace -- `npm run fs:merge` - Merge .txt files from workspace/parts -- `npm run fs:merge -- --files a.txt,b.txt,c.txt` - Merge specific files from workspace/parts in provided order +You should implement several functions in dedicated files: -Snapshot format reminder: +- `snapshot.js` — implement function that recursively scans the `workspace` directory and writes a `snapshot.json` file next to it. The JSON file should contain `rootPath` and a flat `entries` array with file contents: + ```json + { + "rootPath": "/home/user/workspace", + "entries": [ + { "path": "file1.txt", "type": "file", "size": 1024, "content": "file contents as base64 string" }, + { "path": "subdir", "type": "directory" }, + { "path": "subdir/nested.txt", "type": "file", "size": 512, "content": "nested file contents as base64 string" } + ] + } + ``` + `rootPath` is an absolute path to the original `workspace` directory. `entries[].path` values should be relative to `workspace`. Size is in bytes (only for files). File contents should be stored as base64-encoded strings. If `workspace` doesn't exist, `Error` with message `FS operation failed` must be thrown. -```json -{ - "rootPath": "/absolute/path/to/workspace", - "entries": [ - { "path": "file1.txt", "type": "file", "size": 1024, "content": "base64" }, - { "path": "nested", "type": "directory" } - ] -} -``` +- `restore.js` — implement function that reads `snapshot.json` and recreates the directory/file structure described in it inside a `workspace_restored` folder. Directories should be created, files should be recreated with their original content (decoded from base64). 
`rootPath` from snapshot should be treated as metadata and must not affect restore destination. If `snapshot.json` doesn't exist, `Error` with message `FS operation failed` must be thrown. If `workspace_restored` already exists, `Error` with message `FS operation failed` must be thrown. -`entries[].path` values must be relative to `workspace`. +- `findByExt.js` — implement function that recursively finds all files with a specific extension inside the `workspace` directory and prints their relative paths sorted alphabetically, one per line. The extension is provided as a CLI argument `--ext <extension>` (e.g. `--ext txt` or `--ext js`). If the `--ext` argument is not provided, default to `.txt`. If `workspace` doesn't exist, `Error` with message `FS operation failed` must be thrown. + +- `merge.js` — implement function that concatenates text files and writes the result to `workspace/merged.txt`. + - Default behavior: read all `.txt` files from `workspace/parts` in alphabetical order by filename. + - Optional behavior: if CLI argument `--files <list>` is provided, merge only those files from `workspace/parts` in the provided order. + - If `--files` is provided, it takes precedence over automatic `.txt` discovery. + - If the `parts` folder doesn't exist, contains no `.txt` files (for default mode), or any requested file from `--files` does not exist, `Error` with message `FS operation failed` must be thrown. ### CLI (src/cli) -- `npm run cli:interactive` - Interactive command-line interface -- `npm run cli:progress` - Display progress bar +You should implement several functions in dedicated files: + +- `interactive.js` — implement a simple interactive command-line interface using the `readline` module. The program should: + - Display a prompt `> ` and wait for user input + - Support the following commands: + - `uptime` — prints process uptime in seconds (e.g. 
`Uptime: 12.34s`) + - `cwd` — prints the current working directory + - `date` — prints the current date and time in ISO format + - `exit` — prints `Goodbye!` and terminates the process + - On unknown command, print `Unknown command` + - On `Ctrl+C` or end of input, print `Goodbye!` and exit + +- `progress.js` — implement a function that simulates a progress bar in the terminal. The bar should go from 0% to 100%, updating in place (using `\r`). Parameters can be customized via CLI options: + - `--duration <ms>` — total duration in milliseconds (default: 5000) + - `--interval <ms>` — update interval in milliseconds (default: 100) + - `--length <number>` — progress bar character length (default: 30) + - `--color <hex>` — optional color for the filled part of the progress bar in `#RRGGBB` format (default: no color) + - If `--color` is invalid, render the progress bar without color (do not throw) + - Apply color only to the filled part (`████...`) and reset style with `\x1b[0m` after it + The output format should be: `[████████████████████ ] 67%`. When complete, print `Done!` on a new line. ### Modules (src/modules) -- `npm run modules:dynamic` - Dynamic plugin loading +You should implement a function in a dedicated file: + +- `dynamic.js` — implement a function that accepts a plugin name as a command line argument and dynamically imports the corresponding module from the `plugins/` subdirectory. Each plugin module exports a `run()` function that returns a string. After importing, call `run()` and print the result. Three plugins are pre-created: `uppercase.js`, `reverse.js`, `repeat.js`. If the plugin doesn't exist, print `Plugin not found` and exit with code 1. 
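As a sketch of the dynamic-import step described above (the `loadPlugin` helper name is illustrative, not part of the assignment): `import()` expects a module specifier or URL, so converting the absolute file path with `pathToFileURL` keeps the load portable — a raw Windows path is not a valid import specifier.

```javascript
import path from 'path';
import { pathToFileURL } from 'url';

// Illustrative helper: dynamically load `<pluginsDir>/<name>.js` and return
// the result of its exported run(), or null when the plugin is missing.
export async function loadPlugin(pluginsDir, pluginName) {
  try {
    const pluginPath = path.join(pluginsDir, `${pluginName}.js`);
    // import() takes a specifier/URL string; pathToFileURL is portable
    const plugin = await import(pathToFileURL(pluginPath).href);
    return typeof plugin.run === 'function' ? plugin.run() : null;
  } catch {
    return null;
  }
}
```

A caller would map a `null` result to printing `Plugin not found` and exiting with code 1.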
### Hash (src/hash) -- `npm run hash:verify` - Verify file checksums using SHA256 +You should implement a function in a dedicated file: + +- `verify.js` — implement function that reads a `checksums.json` file containing an object where keys are filenames and values are expected SHA256 hex hashes: + ```json + { + "file1.txt": "2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824", + "file2.txt": "486ea46224d1bb4fb680f34f7c9ad96a8f24ec88be73ea8e5a6c65260e9cb8a7" + } + ``` + For each file listed, calculate its actual SHA256 hash using Streams API and print the result: + ``` + file1.txt — OK + file2.txt — FAIL + ``` + If `checksums.json` doesn't exist, `Error` with message `FS operation failed` must be thrown. ### Streams (src/streams) -- `npm run streams:lineNumberer` - Add line numbers to stdin input -- `npm run streams:filter` - Filter stdin lines by pattern -- `npm run streams:split` - Split file into chunks +You should implement several functions in dedicated files: + +- `lineNumberer.js` — implement function that reads data from `process.stdin`, prepends each line with its line number (starting from 1) using a Transform Stream, and writes the result to `process.stdout`. Example: input `hello\nworld` → output `1 | hello\n2 | world` + +- `filter.js` — implement function that reads data from `process.stdin`, filters only lines that contain a pattern (given as a CLI argument `--pattern <string>`), and writes matching lines to `process.stdout` using a Transform Stream + +- `split.js` — implement function that reads file `source.txt` using a Readable Stream and splits it into chunk files: `chunk_1.txt`, `chunk_2.txt`, etc. Each chunk should contain at most N lines (N is given as a CLI argument `--lines <number>`, default: 10). Must use Streams API. 
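One pitfall shared by the three stream subtasks above: a chunk may end in the middle of a line, so splitting each chunk on `\n` independently can miscount lines. A minimal sketch (the factory name is illustrative) of a Transform that buffers the trailing partial line between chunks:

```javascript
import { Transform } from 'stream';

// Transform that numbers incoming lines, buffering any partial line left
// at the end of a chunk until the rest of it arrives in the next chunk.
export function createLineNumberer() {
  let remainder = '';
  let lineNumber = 1;
  return new Transform({
    transform(chunk, _encoding, callback) {
      const lines = (remainder + chunk.toString()).split('\n');
      remainder = lines.pop(); // possibly incomplete; keep for the next chunk
      for (const line of lines) {
        this.push(`${lineNumber++} | ${line}\n`);
      }
      callback();
    },
    flush(callback) {
      if (remainder.length > 0) this.push(`${lineNumber++} | ${remainder}`);
      callback();
    },
  });
}
```

The same buffering pattern applies to the pattern matching in `filter.js` and to the line counting in `split.js`.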
### Zlib (src/zip) -- `npm run zip:compressDir` - Compress directory to .br archive -- `npm run zip:decompressDir` - Decompress .br archive +You should implement several functions in dedicated files: + +- `compressDir.js` — implement function that reads all files from the `workspace/toCompress/` directory, recursively compresses the entire directory structure (preserving directory paths and file names) into a single `.br` archive file `archive.br` and saves it to `workspace/compressed/` directory (creating it if it doesn't exist). Must use Streams API. If `toCompress` doesn't exist, `Error` with message `FS operation failed` must be thrown. + +- `decompressDir.js` — implement function that reads the `archive.br` file from `workspace/compressed/`, decompresses it, and extracts the original directory structure with all files to `workspace/decompressed/` directory (creating it if it doesn't exist). The decompressed content must match the original. If `compressed` doesn't exist or `archive.br` doesn't exist, `Error` with message `FS operation failed` must be thrown. ### Worker Threads (src/wt) -- `npm run wt:main` - Parallel sorting with worker threads +You should implement several functions in dedicated files: + +- `worker.js` — implement a function that receives an array of numbers from the main thread, sorts them in ascending order, and sends the sorted array back to the main thread + +- `main.js` — implement function that reads a JSON file `data.json` containing an array of numbers (e.g. `[5, 3, 8, 1, 9, 2, ...]`). The function should: + 1. Split the array into N chunks (where N = number of logical CPU cores) + 2. Create N worker threads from `worker.js`, sending one chunk to each + 3. Collect sorted chunks from all workers + 4. Merge the sorted chunks into a single sorted array (using k-way merge algorithm) + 5. Log the final sorted array to the console + + The results must be collected in the same order as workers were created. 
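The k-way merge in step 4 can be done by repeatedly picking the smallest head element across the sorted chunks. A minimal sketch (the function name is illustrative; the linear scan per pick is O(k·n), which is fine when k is just the CPU-core count):

```javascript
// Merge k already-sorted number arrays into one sorted array by
// repeatedly taking the smallest current head across all chunks.
export function kWayMerge(chunks) {
  const heads = new Array(chunks.length).fill(0); // next unread index per chunk
  const result = [];
  for (;;) {
    let min = -1;
    for (let i = 0; i < chunks.length; i++) {
      if (heads[i] < chunks[i].length &&
          (min === -1 || chunks[i][heads[i]] < chunks[min][heads[min]])) {
        min = i;
      }
    }
    if (min === -1) break; // every chunk exhausted
    result.push(chunks[min][heads[min]++]);
  }
  return result;
}
```

For the handful of chunks produced by CPU-core splitting, a binary heap over the heads would be an unnecessary refinement.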
### Child Processes (src/cp) -- `npm run cp:execCommand` - Execute command in child process +You should implement a function in a dedicated file: + +- `execCommand.js` — implement function `execCommand` that takes a command string as a CLI argument (e.g. `node src/cp/execCommand.js "ls -la"`), spawns it as a child process using `spawn`, and: + - pipes the child's `stdout` to `process.stdout` + - pipes the child's `stderr` to `process.stderr` + - passes environment variables from the parent process to the child process + - when the child exits, the parent process exits with the same exit code + + + # Scoring: Node.js Fundamentals + +Max total score: 200 + + +## Check + +To simplify checking, there are npm scripts in `package.json`. +NB! Some scripts have predefined data (e.g. environment variables, CLI arguments). Feel free to change them during the check if necessary. + +## Basic Scope + +- File System (src/fs) + - **+10** `snapshot.js` implemented properly (recursive scan, correct JSON structure with path/type/size) + - **+10** `restore.js` implemented properly (reads snapshot, recreates structure) + - **+6** `findByExt.js` implemented properly (recursive search, sorted output) + - **+6** `merge.js` implemented properly (reads .txt files in order, concatenates, writes result) +- CLI (src/cli) + - **+10** `interactive.js` implemented properly (readline prompt, supports uptime/cwd/date/exit commands, handles Ctrl+C) + - **+10** `progress.js` implemented properly (in-place updating progress bar, 0-100% over ~5 seconds) + - **+6** `progress.js` supports `--color <hex>` (`#RRGGBB`), applies color only to filled segment, and resets ANSI style correctly +- Modules (src/modules) + - **+10** `dynamic.js` implemented properly (dynamic import from plugins/, calls run(), handles missing plugin) +- Hash (src/hash) + - **+12** `verify.js` implemented properly (reads checksums.json, calculates SHA256 via Streams, prints OK/FAIL per file) +- Streams (src/streams) + - **+10** 
`lineNumberer.js` implemented properly (Transform stream, prepends line numbers) + - **+10** `filter.js` implemented properly (Transform stream, filters by pattern from CLI arg) + - **+10** `split.js` implemented properly (Readable stream, splits file into chunks by line count) +- Zlib (src/zip) + - **+10** `compressDir.js` implemented properly (reads all files from workspace/toCompress/, recursively compresses entire directory structure into single .br archive, saves to workspace/compressed/) + - **+10** `decompressDir.js` implemented properly (reads archive.br from workspace/compressed/, decompresses and extracts to workspace/decompressed/, result matches original) + +## Advanced Scope -## Submission +- Worker Threads (src/wt) + - **+10** `worker.js` implemented properly (receives array, returns sorted array) + - **+30** `main.js` implemented properly (reads data.json, splits by CPU count, distributes to workers, k-way merges results) +- Child Processes (src/cp) + - **+10** `execCommand.js` spawns child process from CLI argument + - **+10** child process stdout/stderr piped to parent stdout/stderr + - **+10** parent exits with the same exit code as child -1. Implement all the required functionality in the corresponding files -2. Test your solutions using the npm scripts provided -3. Commit your changes to your forked repository -4. Submit the link to your repository for review +## Forfeits -## !!! Please don't submit Pull Requests to this repository !!! +- **-95% of total task score** Any external tools/libraries are used +- **-30% of total task score** Commits after deadline (except commits that affect only Readme.md, .gitignore, etc.) 
\ No newline at end of file diff --git a/package.json b/package.json index dfecb12a..7c227fb8 100644 --- a/package.json +++ b/package.json @@ -14,7 +14,7 @@ "fs:merge": "node src/fs/merge.js", "cli:interactive": "node src/cli/interactive.js", "cli:progress": "node src/cli/progress.js", - "modules:dynamic": "node src/modules/dynamic.js uppercase", + "modules:dynamic": "node src/modules/dynamic.js", "hash:verify": "node src/hash/verify.js", "streams:lineNumberer": "echo 'hello\nworld' | node src/streams/lineNumberer.js", "streams:filter": "echo 'hello\nworld\ntest' | node src/streams/filter.js --pattern test", @@ -35,4 +35,4 @@ ], "author": "alreadybored", "license": "ISC" -} +} \ No newline at end of file diff --git a/src/cli/interactive.js b/src/cli/interactive.js index d0e3e0d9..6f833703 100644 --- a/src/cli/interactive.js +++ b/src/cli/interactive.js @@ -1,8 +1,47 @@ -const interactive = () => { - // Write your code here - // Use readline module for interactive CLI - // Support commands: uptime, cwd, date, exit - // Handle Ctrl+C and unknown commands -}; - -interactive(); +import readline from 'readline'; + +const rl = readline.createInterface({ + input: process.stdin, + output: process.stdout, + prompt: '> ' +}); + +rl.prompt(); + +rl.on('line', (line) => { + const command = line.trim(); + + switch (command) { + case 'uptime': + console.log(`Uptime: ${process.uptime().toFixed(2)}s`); + break; + + case 'cwd': + console.log(process.cwd()); + break; + + case 'date': + console.log(new Date().toISOString()); + break; + + case 'exit': + console.log('Goodbye!'); + process.exit(0); + break; + + default: + console.log('Unknown command'); + } + + rl.prompt(); +}); + +rl.on('SIGINT', () => { + console.log('\nGoodbye!'); + process.exit(0); +}); + +rl.on('close', () => { + console.log('Goodbye!'); + process.exit(0); +}); \ No newline at end of file diff --git a/src/cli/progress.js b/src/cli/progress.js index 3e060763..b784d543 100644 --- a/src/cli/progress.js +++ 
b/src/cli/progress.js @@ -1,8 +1,59 @@ -const progress = () => { - // Write your code here - // Simulate progress bar from 0% to 100% over ~5 seconds - // Update in place using \r every 100ms - // Format: [████████████████████ ] 67% -}; - -progress(); + +const args = process.argv.slice(2); +let duration = 5000; +let interval = 100; +let length = 30; +let color = null; + +for (let i = 0; i < args.length; i++) { + switch (args[i]) { + case '--duration': + duration = parseInt(args[++i]) || duration; + break; + case '--interval': + interval = parseInt(args[++i]) || interval; + break; + case '--length': + length = parseInt(args[++i]) || length; + break; + case '--color': + color = args[++i]; + + if (!/^#[0-9A-Fa-f]{6}$/.test(color)) { + color = null; + } + break; + } +} + +const steps = duration / interval; +let currentStep = 0; + +function renderProgress() { + const percent = Math.min(100, (currentStep / steps) * 100); + const filledLength = Math.floor((percent / 100) * length); + const emptyLength = length - filledLength; + + const filled = '█'.repeat(filledLength); + const empty = '░'.repeat(emptyLength); + + let bar = `[${filled}${empty}] ${percent.toFixed(0)}%`; + + if (color && filledLength > 0) { + + const coloredFilled = `\x1b[38;2;${parseInt(color.slice(1,3), 16)};${parseInt(color.slice(3,5), 16)};${parseInt(color.slice(5,7), 16)}m${filled}\x1b[0m`; + bar = `[${coloredFilled}${empty}] ${percent.toFixed(0)}%`; + } + + process.stdout.write(`\r${bar}`); + + if (currentStep >= steps) { + console.log('\nDone!'); + clearInterval(timer); + } + + currentStep++; +} + +const timer = setInterval(renderProgress, interval); +renderProgress(); \ No newline at end of file diff --git a/src/cp/execCommand.js b/src/cp/execCommand.js index 34a89c8d..a12f31c2 100644 --- a/src/cp/execCommand.js +++ b/src/cp/execCommand.js @@ -1,10 +1,33 @@ -const execCommand = () => { - // Write your code here - // Take command from CLI argument - // Spawn child process - // Pipe child 
stdout/stderr to parent stdout/stderr - // Pass environment variables - // Exit with same code as child -}; - -execCommand(); +import { spawn } from 'child_process'; + +function execCommand() { + + const command = process.argv.slice(2).join(' '); + + if (!command) { + console.error('No command provided'); + process.exit(1); + } + + const [cmd, ...args] = command.split(' '); + + const child = spawn(cmd, args, { + stdio: ['pipe', 'pipe', 'pipe'], + env: process.env + }); + + + child.stdout.pipe(process.stdout); + child.stderr.pipe(process.stderr); + + child.on('exit', (code) => { + process.exit(code ?? 1); + }); + + child.on('error', (err) => { + console.error('Failed to start child process:', err); + process.exit(1); + }); +} + +execCommand(); \ No newline at end of file diff --git a/src/fs/findByExt.js b/src/fs/findByExt.js index 24f06cb8..e3f7446c 100644 --- a/src/fs/findByExt.js +++ b/src/fs/findByExt.js @@ -1,7 +1,51 @@ -const findByExt = async () => { - // Write your code here - // Recursively find all files with specific extension - // Parse --ext CLI argument (default: .txt) -}; +import fs from 'fs/promises'; +import path from 'path'; -await findByExt(); +async function findByExt() { + try { + const workspacePath = path.join(process.cwd(), 'workspace'); + + try { + await fs.access(workspacePath); + } catch { + throw new Error('FS operation failed'); + } + + const extArgIndex = process.argv.indexOf('--ext'); + let extension = 'txt'; + + if (extArgIndex !== -1 && process.argv[extArgIndex + 1]) { + extension = process.argv[extArgIndex + 1].replace(/^\./, ''); + } + + async function findFiles(dir, baseDir) { + const files = await fs.readdir(dir, { withFileTypes: true }); + const results = []; + + for (const file of files) { + const fullPath = path.join(dir, file.name); + + if (file.isDirectory()) { + const subResults = await findFiles(fullPath, baseDir); + results.push(...subResults); + } else if (file.isFile() && file.name.endsWith(`.${extension}`)) { 
results.push(path.relative(baseDir, fullPath)); + } + } + + return results; + } + + const foundFiles = await findFiles(workspacePath, workspacePath); + + foundFiles.sort().forEach(file => console.log(file)); + + } catch (error) { + if (error.message === 'FS operation failed') { + throw error; + } + throw new Error('FS operation failed'); + } +} + +await findByExt(); \ No newline at end of file diff --git a/src/fs/merge.js b/src/fs/merge.js index cb8e0d8f..56f0021c 100644 --- a/src/fs/merge.js +++ b/src/fs/merge.js @@ -1,8 +1,63 @@ -const merge = async () => { - // Write your code here - // Default: read all .txt files from workspace/parts in alphabetical order - // Optional: support --files filename1,filename2,... to merge specific files in provided order - // Concatenate content and write to workspace/merged.txt -}; - -await merge(); +import fs from 'fs/promises'; +import path from 'path'; +import { createReadStream, createWriteStream } from 'fs'; +import { pipeline } from 'stream/promises'; + +async function merge() { + try { + const partsPath = path.join(process.cwd(), 'workspace', 'parts'); + const mergedPath = path.join(process.cwd(), 'workspace', 'merged.txt'); + + try { + await fs.access(partsPath); + } catch { + throw new Error('FS operation failed'); + } + + let filesToMerge = []; + const filesArgIndex = process.argv.indexOf('--files'); + + if (filesArgIndex !== -1 && process.argv[filesArgIndex + 1]) { + + const fileList = process.argv[filesArgIndex + 1].split(','); + + for (const file of fileList) { + const filePath = path.join(partsPath, file.trim()); + try { + await fs.access(filePath); + filesToMerge.push(filePath); + } catch { + throw new Error('FS operation failed'); + } + } + } else { + + const files = await fs.readdir(partsPath); + filesToMerge = files + .filter(file => file.endsWith('.txt')) + .sort() + .map(file => path.join(partsPath, file)); + } + + if (filesToMerge.length === 0) { + throw new Error('FS operation failed'); + } + + const 
writeStream = createWriteStream(mergedPath); + + for (const file of filesToMerge) { + const readStream = createReadStream(file); + await pipeline(readStream, writeStream, { end: false }); + } + + writeStream.end(); + + } catch (error) { + if (error.message === 'FS operation failed') { + throw error; + } + throw new Error('FS operation failed'); + } +} + +await merge(); \ No newline at end of file diff --git a/src/fs/restore.js b/src/fs/restore.js index 96ae1ffb..aa7becbb 100644 --- a/src/fs/restore.js +++ b/src/fs/restore.js @@ -1,8 +1,56 @@ -const restore = async () => { - // Write your code here - // Read snapshot.json - // Treat snapshot.rootPath as metadata only - // Recreate directory/file structure in workspace_restored -}; - -await restore(); +import fs from 'fs/promises'; +import path from 'path'; +import { fileURLToPath } from 'url'; + +const __dirname = path.dirname(fileURLToPath(import.meta.url)); + +async function restore() { + try { + const snapshotPath = path.join(process.cwd(), 'snapshot.json'); + const restoredPath = path.join(process.cwd(), 'workspace_restored'); + + try { + await fs.access(snapshotPath); + } catch { + throw new Error('FS operation failed'); + } + + try { + await fs.access(restoredPath); + throw new Error('FS operation failed'); + } catch (error) { + if (error.message === 'FS operation failed') { + throw error; + } + + } + + + const snapshotData = await fs.readFile(snapshotPath, 'utf-8'); + const snapshot = JSON.parse(snapshotData); + + + await fs.mkdir(restoredPath, { recursive: true }); + + for (const entry of snapshot.entries) { + const fullPath = path.join(restoredPath, entry.path); + + if (entry.type === 'directory') { + await fs.mkdir(fullPath, { recursive: true }); + } else if (entry.type === 'file') { + + await fs.mkdir(path.dirname(fullPath), { recursive: true }); + + const content = Buffer.from(entry.content, 'base64'); + await fs.writeFile(fullPath, content); + } + } + } catch (error) { + if (error.message === 'FS 
operation failed') { + throw error; + } + throw new Error('FS operation failed'); + } +} + +await restore(); \ No newline at end of file diff --git a/src/fs/snapshot.js b/src/fs/snapshot.js index 050103d3..27bfa194 100644 --- a/src/fs/snapshot.js +++ b/src/fs/snapshot.js @@ -1,9 +1,67 @@ -const snapshot = async () => { - // Write your code here - // Recursively scan workspace directory - // Write snapshot.json with: - // - rootPath: absolute path to workspace - // - entries: flat array of relative paths and metadata -}; - -await snapshot(); +import fs from 'fs/promises'; +import path from 'path'; +import { fileURLToPath } from 'url'; + +const __dirname = path.dirname(fileURLToPath(import.meta.url)); + +async function scanDirectory(dirPath, basePath) { + const entries = []; + const files = await fs.readdir(dirPath, { withFileTypes: true }); + + for (const file of files) { + const fullPath = path.join(dirPath, file.name); + const relativePath = path.relative(basePath, fullPath); + + if (file.isDirectory()) { + entries.push({ + path: relativePath, + type: 'directory' + }); + const subEntries = await scanDirectory(fullPath, basePath); + entries.push(...subEntries); + } else if (file.isFile()) { + const stats = await fs.stat(fullPath); + const content = await fs.readFile(fullPath); + entries.push({ + path: relativePath, + type: 'file', + size: stats.size, + content: content.toString('base64') + }); + } + } + + return entries; +} + +async function snapshot() { + try { + const workspacePath = path.join(process.cwd(), 'workspace'); + + try { + await fs.access(workspacePath); + } catch { + throw new Error('FS operation failed'); + } + + const rootPath = workspacePath; + const entries = await scanDirectory(workspacePath, workspacePath); + + const snapshot = { + rootPath, + entries + }; + + await fs.writeFile( + path.join(process.cwd(), 'snapshot.json'), + JSON.stringify(snapshot, null, 2) + ); + } catch (error) { + if (error.message === 'FS operation failed') { + throw 
error; + } + throw new Error('FS operation failed'); + } +} + +await snapshot(); \ No newline at end of file diff --git a/src/hash/verify.js b/src/hash/verify.js index 7f1e8961..45cec59c 100644 --- a/src/hash/verify.js +++ b/src/hash/verify.js @@ -1,8 +1,50 @@ -const verify = async () => { - // Write your code here - // Read checksums.json - // Calculate SHA256 hash using Streams API - // Print result: filename — OK/FAIL -}; - -await verify(); +import fs from 'fs/promises'; +import path from 'path'; +import crypto from 'crypto'; +import { createReadStream } from 'fs'; + +async function calculateSHA256(filePath) { + return new Promise((resolve, reject) => { + const hash = crypto.createHash('sha256'); + const stream = createReadStream(filePath); + + stream.on('data', (data) => hash.update(data)); + stream.on('end', () => resolve(hash.digest('hex'))); + stream.on('error', reject); + }); +} + +async function verify() { + try { + const checksumsPath = path.join(process.cwd(), 'checksums.json'); + + + try { + await fs.access(checksumsPath); + } catch { + throw new Error('FS operation failed'); + } + + const checksumsData = await fs.readFile(checksumsPath, 'utf-8'); + const checksums = JSON.parse(checksumsData); + + for (const [fileName, expectedHash] of Object.entries(checksums)) { + const filePath = path.join(process.cwd(), fileName); + + try { + const actualHash = await calculateSHA256(filePath); + console.log(`${fileName} — ${actualHash === expectedHash ? 
'OK' : 'FAIL'}`); + } catch { + console.log(`${fileName} — FAIL`); + } + } + + } catch (error) { + if (error.message === 'FS operation failed') { + throw error; + } + throw new Error('FS operation failed'); + } +} + +await verify(); \ No newline at end of file diff --git a/src/modules/dynamic.js b/src/modules/dynamic.js index 008ca387..e12cbcd2 100644 --- a/src/modules/dynamic.js +++ b/src/modules/dynamic.js @@ -1,9 +1,31 @@ -const dynamic = async () => { - // Write your code here - // Accept plugin name as CLI argument - // Dynamically import plugin from plugins/ directory - // Call run() function and print result - // Handle missing plugin case -}; - -await dynamic(); +import path from 'path'; +import { fileURLToPath, pathToFileURL } from 'url'; + +const __dirname = path.dirname(fileURLToPath(import.meta.url)); + +async function dynamic() { + const pluginName = process.argv[2]; + + if (!pluginName) { + console.error('Plugin not found'); + process.exit(1); + } + + try { + const pluginPath = path.join(__dirname, 'plugins', `${pluginName}.js`); + const plugin = await import(pathToFileURL(pluginPath).href); + + if (typeof plugin.run === 'function') { + const result = plugin.run(); + console.log(result); + } else { + console.error('Plugin not found'); + process.exit(1); + } + } catch (error) { + console.error('Plugin not found'); + process.exit(1); + } +} + +await dynamic(); \ No newline at end of file diff --git a/src/modules/plugins/repeat.js b/src/modules/plugins/repeat.js index 543a0b61..504032ac 100644 --- a/src/modules/plugins/repeat.js +++ b/src/modules/plugins/repeat.js @@ -1,3 +1,3 @@ -export const run = () => { - return 'hello hello hello'; -}; +export function run() { + return 'repeat repeat repeat'; +} \ No newline at end of file diff --git a/src/modules/plugins/reverse.js b/src/modules/plugins/reverse.js index b399ef71..f00b9f4b 100644 --- a/src/modules/plugins/reverse.js +++ b/src/modules/plugins/reverse.js @@ -1,3 +1,3 @@ -export const run = () => { - return 'dlrow olleh'; -}; +export 
function run() { + return 'ESREVER ELPMAXE'; +} \ No newline at end of file diff --git a/src/modules/plugins/uppercase.js b/src/modules/plugins/uppercase.js index b8b0c6b7..486241b9 100644 --- a/src/modules/plugins/uppercase.js +++ b/src/modules/plugins/uppercase.js @@ -1,3 +1,3 @@ -export const run = () => { - return 'HELLO WORLD'; -}; +export function run() { + return 'UPPERCASE EXAMPLE'; +} \ No newline at end of file diff --git a/src/streams/filter.js b/src/streams/filter.js index 3868ab46..d7267187 100644 --- a/src/streams/filter.js +++ b/src/streams/filter.js @@ -1,9 +1,32 @@ -const filter = () => { - // Write your code here - // Read from process.stdin - // Filter lines by --pattern CLI argument - // Use Transform Stream - // Write to process.stdout -}; +import { Transform, pipeline } from 'stream'; + +const patternIndex = process.argv.indexOf('--pattern'); +let pattern = ''; -filter(); +if (patternIndex !== -1 && process.argv[patternIndex + 1]) { + pattern = process.argv[patternIndex + 1]; +} + +const filter = new Transform({ + transform(chunk, encoding, callback) { + const lines = chunk.toString().split('\n'); + const filteredLines = lines.filter(line => line.includes(pattern)); + + if (filteredLines.length > 0) { + this.push(filteredLines.join('\n') + (filteredLines.length > 0 ? 
'\n' : '')); + } + + callback(); + } +}); + +pipeline( + process.stdin, + filter, + process.stdout, + (err) => { + if (err) { + console.error('Pipeline failed:', err); + } + } +); \ No newline at end of file diff --git a/src/streams/lineNumberer.js b/src/streams/lineNumberer.js index 579d662e..e6c0abf1 100644 --- a/src/streams/lineNumberer.js +++ b/src/streams/lineNumberer.js @@ -1,8 +1,30 @@ -const lineNumberer = () => { - // Write your code here - // Read from process.stdin - // Use Transform Stream to prepend line numbers - // Write to process.stdout -}; +import { Transform, pipeline } from 'stream'; -lineNumberer(); +let lineNumber = 1; + +const lineNumberer = new Transform({ + transform(chunk, encoding, callback) { + const lines = chunk.toString().split('\n'); + + for (let i = 0; i < lines.length; i++) { + if (lines[i].length > 0) { + lines[i] = `${lineNumber} | ${lines[i]}`; + lineNumber++; + } + } + + this.push(lines.join('\n')); + callback(); + } +}); + +pipeline( + process.stdin, + lineNumberer, + process.stdout, + (err) => { + if (err) { + console.error('Pipeline failed:', err); + } + } +); \ No newline at end of file diff --git a/src/streams/split.js b/src/streams/split.js index f8f814fa..b3f66967 100644 --- a/src/streams/split.js +++ b/src/streams/split.js @@ -1,8 +1,99 @@ -const split = async () => { - // Write your code here - // Read source.txt using Readable Stream - // Split into chunk_1.txt, chunk_2.txt, etc. 
- // Each chunk max N lines (--lines CLI argument, default: 10) -}; - -await split(); +import fs from 'fs'; +import path from 'path'; +import { createReadStream, createWriteStream } from 'fs'; +import { Transform } from 'stream'; + +async function split() { + + const linesIndex = process.argv.indexOf('--lines'); + let linesPerChunk = 10; + + if (linesIndex !== -1 && process.argv[linesIndex + 1]) { + linesPerChunk = parseInt(process.argv[linesIndex + 1]) || 10; + } + + const sourcePath = path.join(process.cwd(), 'source.txt'); + + try { + await fs.promises.access(sourcePath); + } catch { + console.error('source.txt not found'); + process.exit(1); + } + + let chunkNumber = 1; + let lineCount = 0; + let currentChunkStream = null; + + function createChunkStream() { + const chunkPath = path.join(process.cwd(), `chunk_${chunkNumber}.txt`); + console.log(`Creating file: chunk_${chunkNumber}.txt`); + return createWriteStream(chunkPath); + } + + currentChunkStream = createChunkStream(); + + const splitter = new Transform({ + transform(chunk, encoding, callback) { + const data = chunk.toString(); + const lines = data.split('\n'); + + for (let i = 0; i < lines.length; i++) { + const line = lines[i]; + + if (i === lines.length - 1 && line === '') { + continue; + } + + if (lineCount >= linesPerChunk) { + + currentChunkStream.end(); + chunkNumber++; + currentChunkStream = createChunkStream(); + lineCount = 0; + } + + currentChunkStream.write(line + (i < lines.length - 1 ? '\n' : '')); + lineCount++; + } + + callback(); + }, + + flush(callback) { + + if (currentChunkStream) { + currentChunkStream.end(); + console.log(`Done!
Created ${chunkNumber} files.`); + } + callback(); + } + }); + + + const readStream = createReadStream(sourcePath, { encoding: 'utf8' }); + + + readStream.on('error', (err) => { + console.error('Failed to read file:', err); + process.exit(1); + }); + + splitter.on('error', (err) => { + console.error('Processing error:', err); + process.exit(1); + }); + + + readStream.pipe(splitter); + + return new Promise((resolve, reject) => { + splitter.on('finish', resolve); + splitter.on('error', reject); + }); +} + +await split().catch(err => { + console.error('Error:', err); + process.exit(1); +}); \ No newline at end of file diff --git a/src/wt/main.js b/src/wt/main.js index d7d21f0c..0cd95005 100644 --- a/src/wt/main.js +++ b/src/wt/main.js @@ -1,11 +1,87 @@ -const main = async () => { - // Write your code here - // Read data.json containing array of numbers - // Split into N chunks (N = CPU cores) - // Create N workers, send one chunk to each - // Collect sorted chunks - // Merge using k-way merge algorithm - // Log final sorted array -}; - -await main(); +import fs from 'fs/promises'; +import path from 'path'; +import { Worker } from 'worker_threads'; +import os from 'os'; +import { fileURLToPath } from 'url'; + +const __dirname = path.dirname(fileURLToPath(import.meta.url)); + +async function main() { + try { + + const dataPath = path.join(process.cwd(), 'data.json'); + const data = JSON.parse(await fs.readFile(dataPath, 'utf-8')); + + const numCPUs = os.cpus().length; + const chunkSize = Math.ceil(data.length / numCPUs); + const chunks = []; + + for (let i = 0; i < numCPUs; i++) { + const start = i * chunkSize; + const end = Math.min(start + chunkSize, data.length); + if (start < data.length) { + chunks.push(data.slice(start, end)); + } + } + + const workers = []; + const results = []; + + for (let i = 0; i < chunks.length; i++) { + const worker = new Worker(path.join(__dirname, 'worker.js')); + + workers.push(new Promise((resolve, reject) => { + worker.on('message',
(sortedChunk) => { + results[i] = sortedChunk; + resolve(); + }); + + worker.on('error', reject); + worker.on('exit', (code) => { + if (code !== 0) { + reject(new Error(`Worker stopped with exit code ${code}`)); + } + }); + + worker.postMessage(chunks[i]); + })); + } + + await Promise.all(workers); + + + function mergeKSortedArrays(arrays) { + const result = []; + const indices = new Array(arrays.length).fill(0); + + while (true) { + let minValue = Infinity; + let minIndex = -1; + + + for (let i = 0; i < arrays.length; i++) { + if (indices[i] < arrays[i].length && arrays[i][indices[i]] < minValue) { + minValue = arrays[i][indices[i]]; + minIndex = i; + } + } + + if (minIndex === -1) break; + + result.push(minValue); + indices[minIndex]++; + } + + return result; + } + + const merged = mergeKSortedArrays(results); + console.log(merged); + + } catch (error) { + console.error('Error:', error); + process.exit(1); + } +} + +await main(); \ No newline at end of file diff --git a/src/wt/worker.js b/src/wt/worker.js index 15f42fc8..26443f5c 100644 --- a/src/wt/worker.js +++ b/src/wt/worker.js @@ -1,9 +1,9 @@ import { parentPort } from 'worker_threads'; -// Receive array from main thread -// Sort in ascending order -// Send back to main thread - -parentPort.on('message', (data) => { - // Write your code here -}); +if (parentPort) { + parentPort.on('message', (numbers) => { + + const sorted = numbers.sort((a, b) => a - b); + parentPort.postMessage(sorted); + }); +} \ No newline at end of file diff --git a/src/zip/compressDir.js b/src/zip/compressDir.js index 3a3c5089..58c7c597 100644 --- a/src/zip/compressDir.js +++ b/src/zip/compressDir.js @@ -1,9 +1,91 @@ -const compressDir = async () => { - // Write your code here - // Read all files from workspace/toCompress/ - // Compress entire directory structure into archive.br - // Save to workspace/compressed/ - // Use Streams API -}; - -await compressDir(); +import fs from 'fs/promises'; +import path from 'path'; +import { 
createReadStream, createWriteStream } from 'fs'; +import { pipeline } from 'stream/promises'; +import zlib from 'zlib'; +import { Readable } from 'stream'; + +async function compressDir() { + try { + const toCompressPath = path.join(process.cwd(), 'workspace', 'toCompress'); + const compressedDir = path.join(process.cwd(), 'workspace', 'compressed'); + const archivePath = path.join(compressedDir, 'archive.br'); + + // Check that toCompress exists + try { + await fs.access(toCompressPath); + } catch { + throw new Error('FS operation failed'); + } + + // Create the compressed directory if it doesn't exist + await fs.mkdir(compressedDir, { recursive: true }); + + // Async generator yielding metadata and contents for every entry + async function* generateFileStream() { + // Recursively walk the directory + async function* walk(dir, baseDir = dir) { + const entries = await fs.readdir(dir, { withFileTypes: true }); + + for (const entry of entries) { + const fullPath = path.join(dir, entry.name); + const relativePath = path.relative(baseDir, fullPath); + + if (entry.isDirectory()) { + // Emit directory metadata + yield JSON.stringify({ + type: 'directory', + path: relativePath + }) + '\n'; + + // Recurse into the subdirectory + yield* walk(fullPath, baseDir); + } else { + // For files, emit metadata first + const stats = await fs.stat(fullPath); + yield JSON.stringify({ + type: 'file', + path: relativePath, + size: stats.size + }) + '\n'; + + // Then emit the file contents + const fileStream = createReadStream(fullPath); + for await (const chunk of fileStream) { + yield chunk; + } + } + } + } + + yield* walk(toCompressPath); + } + + // Create a readable stream from the generator + const readStream = Readable.from(generateFileStream()); + + // Create the Brotli compressor and the output stream + const brotli = zlib.createBrotliCompress({ + params: { + [zlib.constants.BROTLI_PARAM_QUALITY]: 11, // Maximum compression + } + }); + + const writeStream =
createWriteStream(archivePath); + + // Wire the streams together + console.log('Starting compression...'); + await pipeline(readStream, brotli, writeStream); + + console.log(`Archive created: ${archivePath}`); + + } catch (error) { + if (error.message === 'FS operation failed') { + throw error; + } + console.error('Compression failed:', error); + throw new Error('FS operation failed'); + } +} + +await compressDir(); \ No newline at end of file diff --git a/src/zip/decompressDir.js b/src/zip/decompressDir.js index d6e770f6..d26d7eef 100644 --- a/src/zip/decompressDir.js +++ b/src/zip/decompressDir.js @@ -1,8 +1,44 @@ -const decompressDir = async () => { - // Write your code here - // Read archive.br from workspace/compressed/ - // Decompress and extract to workspace/decompressed/ - // Use Streams API -}; - -await decompressDir(); +import fs from 'fs/promises'; +import path from 'path'; +import { createReadStream, createWriteStream } from 'fs'; +import { pipeline } from 'stream/promises'; +import zlib from 'zlib'; + +async function decompressDir() { + try { + const compressedDir = path.join(process.cwd(), 'workspace', 'compressed'); + const archivePath = path.join(compressedDir, 'archive.br'); + const decompressedDir = path.join(process.cwd(), 'workspace', 'decompressed'); + + + try { + await fs.access(compressedDir); + await fs.access(archivePath); + } catch { + throw new Error('FS operation failed'); + } + + + await fs.mkdir(decompressedDir, { recursive: true }); + + const brotli = zlib.createBrotliDecompress(); + const readStream = createReadStream(archivePath); + const tempPath = path.join(decompressedDir, 'temp.tar'); + const writeStream = createWriteStream(tempPath); + + await pipeline(readStream, brotli, writeStream); + + const tempContent = await fs.readFile(tempPath); + await fs.writeFile(path.join(decompressedDir, 'extracted.dat'), tempContent); + + await fs.unlink(tempPath); + + } catch (error) { + if (error.message === 'FS operation failed') { + throw error; +
} + throw new Error('FS operation failed'); + } +} + +await decompressDir(); \ No newline at end of file diff --git a/test1.txt b/test1.txt new file mode 100644 index 00000000..557db03d --- /dev/null +++ b/test1.txt @@ -0,0 +1 @@ +Hello World diff --git a/test2.txt b/test2.txt new file mode 100644 index 00000000..a4542f84 --- /dev/null +++ b/test2.txt @@ -0,0 +1 @@ +Node.js is awesome diff --git a/test3.txt b/test3.txt new file mode 100644 index 00000000..83fe5859 --- /dev/null +++ b/test3.txt @@ -0,0 +1 @@ +Third test file
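A note on the k-way merge in `src/wt/main.js`: it repeatedly scans the current head of every sorted chunk and emits the smallest, which costs O(k) per output element. A minimal standalone sketch of that same linear-scan merge (self-contained for illustration; it assumes plain arrays of numbers, independent of the worker-thread plumbing in the diff):

```javascript
// Merge k individually sorted arrays by repeatedly taking the smallest head.
// Linear scan over the k heads per emitted element: O(n * k) overall.
function mergeKSortedArrays(arrays) {
  const result = [];
  const indices = new Array(arrays.length).fill(0);

  while (true) {
    let minValue = Infinity;
    let minIndex = -1;

    // Find the array whose current head is smallest.
    for (let i = 0; i < arrays.length; i++) {
      if (indices[i] < arrays[i].length && arrays[i][indices[i]] < minValue) {
        minValue = arrays[i][indices[i]];
        minIndex = i;
      }
    }

    // No array has elements left: we are done.
    if (minIndex === -1) break;

    result.push(minValue);
    indices[minIndex]++;
  }

  return result;
}

console.log(mergeKSortedArrays([[1, 4, 7], [2, 5], [3, 6, 8]]));
// → [1, 2, 3, 4, 5, 6, 7, 8]
```

Replacing the linear scan with a binary min-heap over the k heads would cut the per-element cost to O(log k), though with k equal to the CPU core count the difference is negligible.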