Sync eng/common directory with azure-sdk-tools for PR 15278#9895
Adds an opt-in capability to the shared TypeSpec emitter pipeline template that bundles an emitter package and uploads it to the TypeSpec playground blob storage. The uploaded `<pkgName>/latest.json` import map is consumed by in-browser playgrounds (e.g. https://azure.github.io/typespec-azure) via their `additionalPlaygroundPackages` mechanism. Self-contained tooling lives in `eng/common/playground-bundle/` and mirrors `@typespec/bundle-uploader` from microsoft/typespec (which is private and cannot be installed from npm).

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Pull request overview
Adds a self-contained Node-based uploader under eng/common to bundle TypeSpec emitter packages and publish them to the TypeSpec playground storage, plus an opt-in pipeline hook to run it from the shared TypeSpec emitter archetype.
Changes:
- Added `eng/common/playground-bundle/upload.mjs` to bundle an emitter package via `@typespec/bundler` and upload bundle assets + manifests to `typespec` storage.
- Added a pinned-dependency Node package (`package.json` + `package-lock.json`) and documentation for the uploader tool.
- Extended `archetype-typespec-emitter.yml` with a new `UploadPlaygroundBundle` parameter and an internal non-PR upload step using `AzureCLI@2`.
Reviewed changes
Copilot reviewed 4 out of 5 changed files in this pull request and generated 3 comments.
| File | Description |
|---|---|
| eng/common/playground-bundle/upload.mjs | Implements bundling + blob upload workflow for TypeSpec playground consumption. |
| eng/common/playground-bundle/package.json | Declares pinned runtime dependencies and Node engine requirement for the uploader. |
| eng/common/playground-bundle/package-lock.json | Locks dependency graph for reproducible installs in pipelines. |
| eng/common/playground-bundle/README.md | Documents purpose and pipeline invocation of the uploader. |
| eng/common/pipelines/templates/archetype-typespec-emitter.yml | Adds opt-in pipeline steps to install and run the uploader on internal non-PR builds. |
Files not reviewed (1)
- eng/common/playground-bundle/package-lock.json: Language not supported
```js
const manifestExisted = await manifestAlreadyExists(container, bundle.manifest);
if (!manifestExisted) {
  for (const file of bundle.files) {
    await uploadJsFile(container, bundle.manifest.name, resolvedVersion, file);
  }
  await uploadManifest(container, bundle.manifest);
  console.log(`✔ Uploaded bundle ${bundle.manifest.name}@${resolvedVersion}`);
} else {
  console.log(`Bundle ${bundle.manifest.name}@${resolvedVersion} already exists, skipping upload.`);
}
```
The existence check + conditional upload has a TOCTOU race: another pipeline/job can upload the manifest between manifestAlreadyExists and uploadManifest, causing uploadManifest to fail on ifNoneMatch: "*" and fail the job even though the desired state is reached. Consider catching the specific Storage error (e.g., condition-not-met / 412) and treating it as "already exists" (skip) to make retries/concurrent runs fully idempotent.
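One way to realize this suggestion, sketched below under the assumption that the Storage SDK surfaces the rejected conditional PUT as an error carrying `statusCode` and `code` fields (the `@azure/storage-blob` `RestError` shape, with codes such as `BlobAlreadyExists` or `ConditionNotMet`). The helper names are hypothetical, and `doUpload` stands in for the real `uploadManifest(container, manifest)` call:

```javascript
// Classify the failure that ifNoneMatch: "*" produces when another
// pipeline/job created the blob between the existence check and the upload.
function isAlreadyExistsError(err) {
  return (
    err?.statusCode === 412 ||              // HTTP 412 Precondition Failed
    err?.code === "BlobAlreadyExists" ||
    err?.code === "ConditionNotMet"
  );
}

// Wrap the conditional upload so concurrent runs and retries are idempotent.
async function uploadManifestIdempotent(doUpload) {
  try {
    await doUpload();
    return "uploaded";
  } catch (err) {
    if (isAlreadyExistsError(err)) {
      // Another run won the race: the desired state is already reached.
      return "already-exists";
    }
    throw err; // anything else is a real failure
  }
}
```

With this wrapper the caller can log "already exists, skipping upload" in both the pre-check branch and the race branch, instead of failing the job.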
```js
const content = JSON.stringify(manifest);
await blob.upload(content, content.length, {
  blobHTTPHeaders: { blobContentType: "application/json; charset=utf-8" },
  conditions: { ifNoneMatch: "*" },
});
```
blob.upload(content, content.length, ...) uses JavaScript string length (UTF-16 code units), which can differ from the actual byte size sent over the wire. This can cause Azure Storage to reject the upload or truncate when the JSON contains non-ASCII characters. Use Buffer.byteLength(content, 'utf8') (or upload via uploadData(Buffer.from(content))) to provide the correct length.
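The mismatch only bites for non-ASCII payloads; a minimal sketch of the suggested fix, using only Node's built-in `Buffer.byteLength` (the `utf8ByteLength` helper is illustrative, not part of the uploader):

```javascript
// Compute the UTF-8 byte length to pass to blob.upload, instead of the
// UTF-16 code-unit count that String.prototype.length returns.
function utf8ByteLength(s) {
  return Buffer.byteLength(s, "utf8");
}

const content = JSON.stringify({ name: "émittér" });
// content.length under-counts here: "é" is 1 UTF-16 code unit but 2 UTF-8 bytes.
console.log(content.length, utf8ByteLength(content));

// In the uploader this would become:
//   await blob.upload(content, utf8ByteLength(content), { ... });
// or, sidestepping the length argument entirely:
//   await blob.uploadData(Buffer.from(content, "utf8"));
```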
```js
const blob = container.getBlockBlobClient(normalizePath(joinUnix(pkgName, "latest.json")));
const content = JSON.stringify(index);
await blob.upload(content, content.length, {
  blobHTTPHeaders: { blobContentType: "application/json; charset=utf-8" },
});
```
Same string-length vs byte-length issue as above: blob.upload(content, content.length, ...) should use the UTF-8 byte length (e.g., Buffer.byteLength) or upload a Buffer. Otherwise latest.json uploads can fail or be corrupted for non-ASCII content.
Sync eng/common directory with azure-sdk-tools for PR Azure/azure-sdk-tools#15278. See the eng/common workflow.