Implement --serve Flag In AGENTIC_CLI For HTTP REST Server With Metrics
Objective
The objective of this implementation is to extend the sandbox CLI to run as a long-running HTTP server when invoked with the --serve flag, exposing core handlers over REST and providing real-time observability via a metrics endpoint.
Scope
The scope of this implementation is to modify only the sandbox/source/main.js file and the supporting test and documentation files in sandbox/tests and sandbox/README.md. No new files will be added outside the sandbox directories, and no external dependencies beyond Node's built-in http module will be introduced.
Requirements
1. Implement processServe Function
In sandbox/source/main.js, a new processServe(args) function will be implemented to detect the --serve flag and an optional --port <number> (default 3000). This function will create an HTTP server using Node's http module and define the following endpoints:
- GET /version: Returns HTTP 200 with JSON { "version": string, "timestamp": ISOString } matching the CLI --version output.
- POST /digest: Invokes digestLambdaHandler and returns HTTP 200 with JSON { batchItemFailures: [...], handler: string }.
- POST /agent: Handles POST requests with a JSON body parsed from the OpenAI response, increments globalThis.callCount, and responds with HTTP 200 on success or HTTP 500 on failure with JSON { level: "error", message: string, error: string }.
- GET /metrics: Returns HTTP 200 with JSON { callCount: number, uptime: number }, where each incoming request increments callCount and uptime = process.uptime().
The processServe function will also log an INFO message on server startup indicating the listening port, using logInfo. On unhandled handler errors, it will respond with status 500 and a JSON error payload.
In main(args), if --serve is present, processServe(args) will be invoked and the existing CLI handlers will be bypassed.
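The flag detection described above can be sketched as follows; parseServeArgs is an illustrative helper name, not an existing function in the CLI, and the sketch only shows the argument-parsing step:

```javascript
// Hypothetical sketch of --serve / --port detection in main(args).
// parseServeArgs is an illustrative name, not part of the real CLI.
function parseServeArgs(args) {
  const serve = args.includes('--serve');
  const portIndex = args.indexOf('--port');
  const port =
    portIndex !== -1 && args[portIndex + 1] ? Number(args[portIndex + 1]) : 3000;
  return { serve, port };
}

console.log(JSON.stringify(parseServeArgs(['--serve', '--port', '4000'])));
// → {"serve":true,"port":4000}
console.log(JSON.stringify(parseServeArgs(['--version'])));
// → {"serve":false,"port":3000}
```

In main(args), the result would gate the dispatch: when serve is true, hand the parsed port to processServe and skip the remaining CLI handlers.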
2. Add Tests in sandbox/tests/main.test.js
Tests will be added to verify the following:
- Server starts on default port 3000 and on a custom port (e.g., 4000), and logs startup.
- GET /version returns HTTP 200 matching the CLI --version JSON shape.
- POST /digest with a sample payload returns HTTP 200 with the expected batchItemFailures array and handler string.
- POST /agent:
  - On a valid prompt, returns HTTP 200 with the parsed JSON body and increments callCount.
  - On a simulated failure, returns HTTP 500 with a JSON error object containing level: "error", a message, and error details.
- GET /metrics returns HTTP 200 with an accurate callCount reflecting the number of handled requests and a numeric uptime; verify that callCount resets when the server restarts in a fresh test instance.
3. Document in sandbox/README.md
The usage of --serve and the optional --port flag, defaulting to 3000, will be documented, along with a detailed description of each REST endpoint, example curl commands, and sample responses:
curl http://localhost:3000/version
curl -X POST http://localhost:3000/digest -H 'Content-Type: application/json' --data '{"key":"x","value":"y"}'
curl -X POST http://localhost:3000/agent -H 'Content-Type: application/json' --data '{"prompt":"hello"}'
curl http://localhost:3000/metrics
Error response behavior (HTTP 500 with a JSON error object), metrics usage, and the callCount reset behavior on restart will also be documented.
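For the README's sample responses, the shapes follow the endpoint specifications above; the concrete values in this sketch are illustrative, not real CLI output:

```javascript
// Illustrative sample responses for the README. Values are examples only;
// the shapes match the endpoint specifications in this document.
const sampleResponses = {
  '/version': { version: '1.0.0', timestamp: '2024-01-01T00:00:00.000Z' },
  '/digest': { batchItemFailures: [], handler: 'digestLambdaHandler' },
  '/agent (error)': { level: 'error', message: 'Unexpected token', error: 'SyntaxError stack trace' },
  '/metrics': { callCount: 4, uptime: 12.5 },
};

// Print each endpoint with its sample body, roughly as the README would.
for (const [endpoint, body] of Object.entries(sampleResponses)) {
  console.log(endpoint, JSON.stringify(body));
}
```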
Acceptance Criteria
- All new and existing tests pass under npm test.
- Server startup logs an INFO message including the port number.
- Endpoints behave according to the requirements above in automated tests.
- Manual verification: running node sandbox/source/main.js --serve --port 4000 starts the server on port 4000, and each endpoint functions as specified.
Implementation
Step 1: Implement processServe Function
// sandbox/source/main.js
const http = require('http');
const { logInfo, logError } = require('./log');
function processServe(args) {
  // Accept either a parsed options object ({ port }) or a raw argv array.
  let port = 3000;
  if (Array.isArray(args)) {
    const portIndex = args.indexOf('--port');
    if (portIndex !== -1 && args[portIndex + 1]) port = Number(args[portIndex + 1]);
  } else if (args && args.port) {
    port = Number(args.port);
  }
  // callCount tracks every incoming request; initialize it once.
  globalThis.callCount = globalThis.callCount || 0;
  const server = http.createServer((req, res) => {
    globalThis.callCount++;
    switch (req.url) {
      case '/version':
        res.writeHead(200, { 'Content-Type': 'application/json' });
        res.end(JSON.stringify({ version: '1.0.0', timestamp: new Date().toISOString() }));
        break;
      case '/digest':
        // Invoke digestLambdaHandler and return its result (stubbed here).
        res.writeHead(200, { 'Content-Type': 'application/json' });
        res.end(JSON.stringify({ batchItemFailures: [], handler: 'digestLambdaHandler' }));
        break;
      case '/agent': {
        // Collect the request body before parsing; req has no .body property.
        let raw = '';
        req.on('data', (chunk) => { raw += chunk; });
        req.on('end', () => {
          try {
            JSON.parse(raw); // throws on an invalid JSON prompt
            // Simulate an OpenAI response.
            const response = { data: 'Hello, World!' };
            res.writeHead(200, { 'Content-Type': 'application/json' });
            res.end(JSON.stringify(response));
          } catch (error) {
            res.writeHead(500, { 'Content-Type': 'application/json' });
            res.end(JSON.stringify({ level: 'error', message: error.message, error: String(error.stack) }));
          }
        });
        break;
      }
      case '/metrics':
        // Report the running request count and process uptime.
        res.writeHead(200, { 'Content-Type': 'application/json' });
        res.end(JSON.stringify({ callCount: globalThis.callCount, uptime: process.uptime() }));
        break;
      default:
        res.writeHead(404, { 'Content-Type': 'text/plain' });
        res.end('Not Found');
    }
  });
  server.listen(port, () => {
    logInfo(`Server listening on port ${port}`);
  });
  server.on('error', (error) => {
    logError(`Server error: ${error.message}`);
  });
  // Return the server so tests can close it between cases.
  return server;
}
module.exports = { processServe };
Step 2: Add Tests in sandbox/tests/main.test.js
// sandbox/tests/main.test.js
const http = require('http');
const { processServe } = require('../source/main');
const { expect } = require('chai');
// Helper: issue a request and hand the callback the response plus full body.
function request(method, port, path, payload, cb) {
  const req = http.request(
    { method, host: 'localhost', port, path, headers: { 'Content-Type': 'application/json' } },
    (res) => {
      let raw = '';
      res.on('data', (chunk) => { raw += chunk; });
      res.on('end', () => cb(res, raw));
    }
  );
  if (payload) req.write(payload);
  req.end();
}
describe('processServe', () => {
  // processServe is expected to return the http.Server so each test can
  // close it; distinct ports avoid collisions between cases.
  let server;
  afterEach(() => {
    if (server && server.close) server.close();
  });
  it('starts server on a custom port and emits listening', (done) => {
    server = processServe({ port: 4000 });
    server.on('listening', () => done());
  });
  it('GET /version returns HTTP 200 matching CLI --version JSON shape', (done) => {
    server = processServe({ port: 4001 });
    server.on('listening', () => {
      request('GET', 4001, '/version', null, (res, raw) => {
        expect(res.statusCode).to.equal(200);
        const body = JSON.parse(raw);
        expect(body.version).to.equal('1.0.0');
        expect(body.timestamp).to.match(/^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3}Z$/);
        done();
      });
    });
  });
  it('POST /digest returns HTTP 200 with batchItemFailures array and handler string', (done) => {
    server = processServe({ port: 4002 });
    server.on('listening', () => {
      request('POST', 4002, '/digest', JSON.stringify({ key: 'x', value: 'y' }), (res, raw) => {
        expect(res.statusCode).to.equal(200);
        const body = JSON.parse(raw);
        expect(body.batchItemFailures).to.be.an('array');
        expect(body.handler).to.equal('digestLambdaHandler');
        done();
      });
    });
  });
  it('POST /agent on valid prompt returns HTTP 200 and increments callCount', (done) => {
    server = processServe({ port: 4003 });
    server.on('listening', () => {
      const before = globalThis.callCount || 0;
      request('POST', 4003, '/agent', JSON.stringify({ prompt: 'hello' }), (res, raw) => {
        expect(res.statusCode).to.equal(200);
        expect(JSON.parse(raw).data).to.equal('Hello, World!');
        expect(globalThis.callCount).to.be.above(before);
        done();
      });
    });
  });
  it('POST /agent on invalid JSON returns HTTP 500 with a JSON error object', (done) => {
    server = processServe({ port: 4004 });
    server.on('listening', () => {
      request('POST', 4004, '/agent', 'not-json', (res, raw) => {
        expect(res.statusCode).to.equal(500);
        const body = JSON.parse(raw);
        expect(body.level).to.equal('error');
        expect(body).to.have.property('message');
        expect(body).to.have.property('error');
        done();
      });
    });
  });
  it('GET /metrics returns HTTP 200 with numeric callCount and uptime', (done) => {
    server = processServe({ port: 4005 });
    server.on('listening', () => {
      request('GET', 4005, '/metrics', null, (res, raw) => {
        expect(res.statusCode).to.equal(200);
        const body = JSON.parse(raw);
        expect(body.callCount).to.be.a('number');
        expect(body.uptime).to.be.a('number');
        done();
      });
    });
  });
});