Send to zulip

This commit is contained in:
Koper
2023-11-20 21:39:33 +07:00
parent 82f50817f8
commit ba40d28152
3609 changed files with 2311843 additions and 7 deletions


@@ -0,0 +1,8 @@
Steps to use console:
- Run `npm install` in root
- `cd scripts`
- `./console`
References:
- [Setting Credentials in Node.js](https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/setting-credentials-node.html)
- [SDK for JavaScript Code Examples](https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/sdk-code-samples.html)


@@ -0,0 +1,97 @@
# AWS SDK for JavaScript Changelog Scripts
These scripts create and update the changelog to summarize what has changed in
each version of the AWS SDK for JavaScript.
Here is a sample of what an entry in the changelog looks like:
## 2.4.5
* bugfix: Waiters: Some description of the bugfix ([Issue #9542]())
* feature: S3: Some description of the new feature
* API: RDS: Some description
* API: DynamoDB: Some description
Here is an overview of the scripts that create and update the changelog:
### create-changelog
This script can be used to create or recreate the changelog based on JSON files
in the `.changes/` directory in the root directory of the SDK. This does not
need to be run on a regular basis but is useful if the changelog accidentally
gets deleted or corrupted. A `.changes/` directory in the root directory will
need to be created before running this script, if it does not already exist. To
run this script, type the following command from the root SDK directory in the
command line:
```
./scripts/changelog/create-changelog
```
The JSON files in the `.changes/` directory must be named with a version number
(e.g. `2.4.5.json`), and their contents should be an array of objects. Each object
represents one change in that version of the SDK, and should contain `"type"`,
`"category"`, and `"description"` properties with string values. Incorrectly
formatted filenames will be skipped over, and incorrectly formatted JSON within
files with correctly formatted names will cause an error to be thrown and halt
the execution of this script. The changelog file is not written to until the
end, so if execution is halted, no files will have changed and no cleanup is
required. The JSON files in `.changes/` are created in the `release` script.
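For illustration, a hypothetical `.changes/2.4.5.json` matching the sample entry shown earlier might contain:
```
[
  {
    "type": "bugfix",
    "category": "Waiters",
    "description": "Some description of the bugfix"
  },
  {
    "type": "feature",
    "category": "S3",
    "description": "Some description of the new feature"
  }
]
```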
### release
This script should be run for each release. It creates a new entry in the
changelog based on JSON files in the `next-release/` directory in the
`.changes/` directory in the root of the SDK. In addition, it will create a
JSON file for the new version in the `.changes/` directory so that the entry
can be recreated when the `create-changelog` script is run. The `.changes/` and
`next-release/` directories will need to be created before running this script,
if they do not already exist. To run this script, type the following command
from the root SDK directory in the command line:
```
./scripts/changelog/release
```
Optionally, you can provide an argument to specify the version number of the
new release. Accepted values are `major`, `minor`, `patch`, or a version number
that is greater than the latest version (e.g. `2.4.6`). An error will be thrown
if the specified version is not greater than the latest version, and execution
will be halted. The first three options specify the type of version bump. For
example, running
```
./scripts/changelog/release minor
```
will bump up the minor version from the latest version. If the latest version
is `2.4.5`, then this would set the new version to `2.5.0`. If no argument is
provided, then the script defaults to bumping the patch number.
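The bump semantics described above can be sketched as follows (an illustration only, not the actual implementation in the `release` script):
```
// Sketch of the version-bump rules: 'major' and 'minor' reset the
// lower-order numbers to zero; 'patch' (the default) increments the
// last number.
function bump(version, kind) {
  var parts = version.split('.').map(Number);
  if (kind === 'major') return (parts[0] + 1) + '.0.0';
  if (kind === 'minor') return parts[0] + '.' + (parts[1] + 1) + '.0';
  return parts[0] + '.' + parts[1] + '.' + (parts[2] + 1);
}

console.log(bump('2.4.5', 'minor')); // 2.5.0
```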
The JSON files in the `next-release/` directory can either contain a single
object or an array of objects. Each object represents one change in the new
version, and should contain `"type"`, `"category"`, and `"description"`
properties with string values. Incorrectly formatted JSON will cause an error
to be thrown and halt execution of this script. If execution is halted due to
this error, no changes will have been made to any files yet at this point, so
no cleanup will be required.
The script merges all changes in `next-release/` to a new JSON file with the
version number as its name, and files in `next-release/` are deleted. A new
entry is then created in the changelog. If for any reason execution is halted
after `next-release/` is cleared but before changes are written to the
changelog, you can either just run the `create-changelog` script or you can
move the new version JSON file into `next-release/` and re-run the `release`
script (the name of the file does not matter).
### add-change cli
This script creates a changelog entry. The script prompts you to
specify a `type` (e.g. bugfix or feature), a `category` (e.g. a service name
or something like: Paginator), and a short `description` describing the change.
This script requires a version of Node.js that supports promises (0.12.x or
higher). To run it, type the following command from the root SDK directory:
```
node ./scripts/changelog/add-change.js
```
This script will place a JSON file representing your change in the following location:
```
$SDK_ROOT/.changes/next-release/
```
Please run this script and include the JSON file when submitting a pull request.
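For example, answering the prompts with `bugfix`, `S3`, and a short description would produce a file whose name includes a random suffix (e.g. `bugfix-S3-1a2b3c4d.json`; the suffix here is made up) containing something like:
```
{
  "type": "bugfix",
  "category": "S3",
  "description": "Some description of the bugfix"
}
```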


@@ -0,0 +1,239 @@
var ChangeCreator = require('./change-creator').ChangeCreator;
/**
* The CLI class to add a changelog entry.
*/
function AddChangeCli() {
this._changeCreator = new ChangeCreator();
this._maxRetries = 2;
this._retryCount = 0;
}
AddChangeCli.prototype = {
/**
* Prints a string to stdout.
* @param {string} message - text to print.
*/
print: function print(message) {
process.stdout.write(message);
},
/**
* Prints the CLI intro message.
*/
showIntro: function showIntro() {
var intro = '\n';
intro += 'This utility will walk you through creating a changelog entry.\n\n';
intro += 'A changelog entry requires:\n';
intro += '\t- type: Type should be one of: feature, bugfix.\n';
intro += '\t- category: This can be a service identifier (e.g. "s3"), or something like: Paginator.\n';
intro += '\t- description: A brief description of the change.\n';
intro += '\t You can also include a GitHub-style reference such as "#111".\n\n';
intro += 'Please run this script before submitting a pull request.\n\n';
intro += 'Press ^C at any time to quit.\n';
this.print(intro);
},
/**
* Gets a string from stdin and returns a promise resolved with the string.
* Note: stdin is read when the user presses 'Enter'.
* Returns a promise that is resolved with the trimmed user input.
*/
retrieveInputAsync: function retrieveInput() {
return new Promise(function(resolve, reject) {
function getData() {
var chunk = process.stdin.read();
if (chunk !== null) {
// Remove self from stdin and call callback
process.stdin.removeListener('readable', getData);
resolve(chunk.trim());
}
}
process.stdin.setEncoding('utf8');
// start listening for input
process.stdin.on('readable', getData);
});
},
/**
* Prompts the user to enter a type.
* Will also process the user input.
* Returns a promise.
*/
promptType: function promptType() {
var changeCreator = this._changeCreator;
var existingType = changeCreator.getChangeType();
this.print('\nValid types are "feature" or "bugfix"\n');
this.print('type: ' + (existingType ? '(' + existingType + ') ' : ''));
return this.retrieveInputAsync()
.then(this.processType.bind(this));
},
/**
* Prompts the user to enter a category.
* Will also process the user input.
* Returns a promise.
*/
promptCategory: function promptCategory() {
var changeCreator = this._changeCreator;
var existingCategory = changeCreator.getChangeCategory();
this.print('\nCategory can be a service identifier or something like: Paginator\n');
this.print('category: ' + (existingCategory ? '(' + existingCategory + ') ' : ''));
return this.retrieveInputAsync()
.then(this.processCategory.bind(this));
},
/**
* Prompts the user to enter a description.
* Will also process the user input.
* Returns a promise.
*/
promptDescription: function promptDescription() {
var changeCreator = this._changeCreator;
var existingDescription = changeCreator.getChangeDescription();
this.print('\nA brief description of your change.\n');
this.print('description: ' + (existingDescription ? '(' + existingDescription + ') ' : ''));
return this.retrieveInputAsync()
.then(this.processDescription.bind(this));
},
/**
* Handles processing of `type` based on user input.
* If validation of `type` fails, the prompt is shown again, up to two more times.
* Returns a promise.
*/
processType: function processType(type) {
var changeCreator = this._changeCreator;
type = type.toLowerCase();
// validate
try {
if (type) {
changeCreator.setChangeType(type);
}
changeCreator.validateChangeType(type);
} catch (err) {
// Log the error
this.print(err.message + '\n');
// re-prompt if we still have retries
if (this._retryCount < this._maxRetries) {
this._retryCount++;
return this.promptType();
}
//otherwise, just exit
return Promise.reject(new Error('Maximum retries exceeded.'));
}
// reset retry count
this._retryCount = 0;
return Promise.resolve();
},
/**
* Handles processing of `category` based on user input.
* If validation of `category` fails, the prompt is shown again, up to two more times.
* Returns a promise.
*/
processCategory: function processCategory(category) {
var changeCreator = this._changeCreator;
// validate
try {
if (category) {
changeCreator.setChangeCategory(category);
}
changeCreator.validateChangeCategory(category);
} catch (err) {
// Log the error
this.print(err.message + '\n');
// re-prompt if we still have retries
if (this._retryCount < this._maxRetries) {
this._retryCount++;
return this.promptCategory();
}
//otherwise, just exit
return Promise.reject(new Error('Maximum retries exceeded.'));
}
// reset retry count
this._retryCount = 0;
return Promise.resolve();
},
/**
* Handles processing of `description` based on user input.
* If validation of `description` fails, the prompt is shown again, up to two more times.
* Returns a promise.
*/
processDescription: function processDescription(description) {
var changeCreator = this._changeCreator;
// validate
try {
if (description) {
changeCreator.setChangeDescription(description);
}
changeCreator.validateChangeDescription(description);
} catch (err) {
// Log the error
this.print(err.message + '\n');
// re-prompt if we still have retries
if (this._retryCount < this._maxRetries) {
this._retryCount++;
return this.promptDescription();
}
//otherwise, just exit
return Promise.reject(new Error('Maximum retries exceeded.'));
}
// reset retry count
this._retryCount = 0;
return Promise.resolve();
},
/**
* Prompts the user for all inputs.
* Returns a promise.
*/
promptInputs: function promptInputs() {
var self = this;
return this.promptType()
.then(this.promptCategory.bind(this))
.then(this.promptDescription.bind(this))
.catch(function(err) {
self.print(err.message);
});
},
/**
* Writes the changelog entry to a JSON file.
* Returns a promise that is resolved with the output filename.
*/
writeChangeEntry: function writeChangeEntry() {
var self = this;
return new Promise(function(resolve, reject) {
var changeCreator = self._changeCreator;
changeCreator.writeChanges(function(err, data) {
if (err) {
return reject(err);
}
self.print('\nFile created at ' + data.file + '\n');
return resolve(data);
});
});
}
};
// Run the CLI program
var cli = new AddChangeCli();
cli.showIntro();
cli.promptInputs()
.then(cli.writeChangeEntry.bind(cli))
.then(function() {
// CLI done with its work, exit successfully.
setTimeout(function() {
process.exit(0);
}, 0);
})
.catch(function(err) {
cli.print(err.message);
cli.print('\nExiting...\n');
setTimeout(function() {
// CLI failed, exit with an error
process.exit(1);
}, 0);
});


@@ -0,0 +1,222 @@
var fs = require('fs');
var path = require('path');
var crypto = require('crypto');
/**
* Configuration Options:
* - write file location
* - read file location
*/
function checkProperty(obj, prop) {
return Object.prototype.hasOwnProperty.call(obj, prop);
}
/**
* Generates a 'random' hex value.
* More 'random' than Math.random without depending on a GUID module.
*/
function generateRandomIdentifier() {
return crypto.randomBytes(4).toString('hex');
}
/**
* Escapes illegal characters from filename
* Ref: https://github.com/aws/aws-sdk-js/issues/3691
*/
function sanitizeFileName(filename) {
return filename.replace(/[^a-zA-Z0-9\\.]/g, '-');
}
var CHANGES_DIR = path.join(process.cwd(), '.changes');
/**
* A map of valid change types.
* Can be referenced outside of this module.
*/
var VALID_TYPES = Object.create(null);
VALID_TYPES['bugfix'] = true;
VALID_TYPES['feature'] = true;
/**
* Handles creating a change log entry JSON file.
*/
function ChangeCreator(config) {
this._config = config || {};
this._type = '';
this._category = '';
this._description = '';
}
ChangeCreator.prototype = {
getChangeType: function getChangeType() {
return this._type;
},
setChangeType: function setChangeType(type) {
this._type = type;
},
getChangeCategory: function getChangeCategory() {
return this._category;
},
setChangeCategory: function setChangeCategory(category) {
this._category = category;
},
getChangeDescription: function getChangeDescription() {
return this._description;
},
setChangeDescription: function setChangeDescription(description) {
this._description = description;
},
/**
* Validates the entire change entry.
*/
validateChange: function validateChange() {
var type = this.getChangeType();
var category = this.getChangeCategory();
var description = this.getChangeDescription();
this.validateChangeType(type);
this.validateChangeCategory(category);
this.validateChangeDescription(description);
return this;
},
/**
* Validates a change entry type.
*/
validateChangeType: function validateChangeType(type) {
type = type || this._type;
if (!type) {
throw new Error('ValidationError: Missing `type` field.');
}
if (VALID_TYPES[type]) {
return this;
}
var validTypes = Object.keys(VALID_TYPES).join(',');
throw new Error('ValidationError: `type` set as "' + type + '" but must be one of [' + validTypes + '].');
},
/**
* Validates a change entry category.
*/
validateChangeCategory: function validateChangeCategory(category) {
category = category || this._category;
if (!category) {
throw new Error('ValidationError: Missing `category` field.');
}
return this;
},
/**
* Validates a change entry description.
*/
validateChangeDescription: function validateChangeDescription(description) {
description = description || this._description;
if (!description) {
throw new Error('ValidationError: Missing `description` field.');
}
return this;
},
/**
* Creates the output directory if it doesn't exist.
*/
createOutputDirectory: function createOutputDirectory(outPath) {
var pathObj = path.parse(outPath);
var sep = path.sep;
var directoryStructure = pathObj.dir.split(sep) || [];
for (var i = 0; i < directoryStructure.length; i++) {
var pathToCheck = directoryStructure.slice(0, i + 1).join(sep);
if (!pathToCheck) {
continue;
}
try {
fs.statSync(pathToCheck);
} catch (err) {
if (err.code === 'ENOENT') {
// Directory doesn't exist, so create it
fs.mkdirSync(pathToCheck);
} else {
throw err;
}
}
}
return this;
},
/**
* Returns a path to the future change entry file.
*/
determineWriteLocation: function determineWriteLocation() {
/* Order for determining write location:
1) Check configuration for `outFile` location.
2) Check configuration for `inFile` location.
3) Create a new file using default location.
*/
var config = this._config || {};
if (checkProperty(config, 'outFile') && config['outFile']) {
return config['outFile'];
}
if (checkProperty(config, 'inFile') && config['inFile']) {
return config['inFile'];
}
// Determine default location
var newFileName = sanitizeFileName(this._type) + '-' + sanitizeFileName(this._category)
+ '-' + generateRandomIdentifier() + '.json';
return path.join(process.cwd(), '.changes', 'next-release', newFileName);
},
/**
* Writes a change entry as a JSON file.
*/
writeChanges: function writeChanges(callback) {
var hasCallback = typeof callback === 'function';
var fileLocation = this.determineWriteLocation();
try {
// Will throw an error if the change is not valid
this.validateChange().createOutputDirectory(fileLocation);
var change = {
type: this.getChangeType(),
category: this.getChangeCategory(),
description: this.getChangeDescription()
};
fs.writeFileSync(fileLocation, JSON.stringify(change, null, 2));
var data = {
file: fileLocation
};
if (hasCallback) {
return callback(null, data);
} else {
return data;
}
} catch (err) {
if (hasCallback) {
return callback(err, null);
} else {
throw err;
}
}
}
}
module.exports = {
ChangeCreator: ChangeCreator,
VALID_TYPES: VALID_TYPES
};


@@ -0,0 +1,11 @@
#!/usr/bin/env node
var util = require('./util');
var versions = util.listVersions();
util.startNewChangelog();
versions.forEach(util.readVersionJSONAndAddToChangelog);
util.writeToChangelog();


@@ -0,0 +1,35 @@
#!/usr/bin/env node
var util = require('./util');
var input = process.argv[2];
var version;
switch (input) {
case 'major':
version = util.bumpMajor();
break;
case 'minor':
version = util.bumpMinor();
break;
case 'patch':
case undefined:
version = util.bumpPatch();
break;
default:
version = util.checkAndNormalizeVersion(input);
}
var nextReleaseFiles = util.listNextReleaseFiles();
var versionJSON = nextReleaseFiles.reduce(function(changes, filepath) {
return changes.concat(util.readChangesFromJSON(filepath));
}, []);
util.writeToVersionJSON(version, versionJSON);
util.clearNextReleaseDir();
util.addVersionJSONToChangelog(version, versionJSON);
util.writeToChangelog();


@@ -0,0 +1,195 @@
var fs = require('fs');
var changelog, latest, nextReleaseFiles;
var changelogFile = process.cwd() + '/CHANGELOG.md';
var changesDir = process.cwd() + '/.changes/';
var nextReleaseDir = changesDir + 'next-release/';
var insertMarker = '<!--ENTRYINSERT-->';
var versionMarker = ['<!--LATEST=', '-->'];
var startContent = '# Changelog for AWS SDK for JavaScript\n' +
versionMarker.join('0.0.0') + '\n' + insertMarker;
var versionRegStr = '(\\d+)\\.(\\d+)\\.(\\d+)';
var versionReg = new RegExp('^' + versionRegStr + '$');
var versionMarkerReg = new RegExp(versionMarker.join(versionRegStr));
var versionJsonFileReg = new RegExp('^' + versionRegStr + '\\.json$');
function fsSyncFromRoot(operation, fileOrDir) {
try {
var result = fs[operation + 'Sync'](fileOrDir);
} catch(err) {
if (err.code === 'ENOENT') {
err.message += '. Make sure to run from the SDK root directory';
}
throw err;
}
return result;
}
function readChangelog() {
changelog = fsSyncFromRoot('readFile', changelogFile).toString();
return changelog;
}
function getLatestVersion() {
if (!changelog) readChangelog();
var match = changelog.match(versionMarkerReg);
latest = {
major: parseInt(match[1],10),
minor: parseInt(match[2],10),
patch: parseInt(match[3],10)
};
return latest;
}
function checkAndNormalizeVersion(version) {
if (!latest) getLatestVersion();
var match = version.match(versionReg);
if (match) {
// convert to num for comparison and for normalizing leading zeros
var major = parseInt(match[1], 10);
var minor = parseInt(match[2], 10);
var patch = parseInt(match[3], 10);
if (major < latest.major ||
major == latest.major && minor < latest.minor ||
major == latest.major && minor == latest.minor && patch <= latest.patch) {
throw new Error('Version must be greater than latest version');
}
return major + '.' + minor + '.' + patch;
} else {
throw new Error('Provided input version is in wrong format');
}
}
function bumpMajor() {
if (!latest) getLatestVersion();
return (latest.major + 1) + '.0.0';
}
function bumpMinor() {
if (!latest) getLatestVersion();
return latest.major + '.' + (latest.minor + 1) + '.0';
}
function bumpPatch() {
if (!latest) getLatestVersion();
return latest.major + '.' + latest.minor + '.' + (latest.patch + 1);
}
function listVersions() {
var changeFiles = fsSyncFromRoot('readdir', changesDir);
return changeFiles
.map(function(file) { return file.match(versionJsonFileReg); })
.filter(function(version) { return !!version; })
.sort(function(v1, v2) {
var diff;
for (var i = 1; i <= 3; i++) {
diff = v1[i] - v2[i];
if (diff !== 0) {
return diff;
}
}
return 0;
})
.map(function(version) { return version.slice(1).join('.'); });
}
function listNextReleaseFiles() {
nextReleaseFiles = fsSyncFromRoot('readdir', nextReleaseDir)
.map(function(file) { return nextReleaseDir + file; });
if (!nextReleaseFiles.length) throw new Error('No changes to be released');
return nextReleaseFiles;
}
function startNewChangelog() {
changelog = startContent;
return changelog;
}
function checkChangeFormat(change) {
if (!change.type || !change.category || !change.description ||
typeof change.type !== 'string' ||
typeof change.category !== 'string' ||
typeof change.description !== 'string') {
var err = new Error('JSON not in correct format');
err.code = 'InvalidFormat';
throw err;
}
}
function readChangesFromJSON(filepath) {
var changes = JSON.parse(fsSyncFromRoot('readFile', filepath));
if (!Array.isArray(changes)) changes = [changes];
if (!changes.length) throw new Error(filepath + ' contains no changes');
try {
changes.forEach(checkChangeFormat);
} catch (err) {
if (err.code === 'InvalidFormat') {
err.message += ' in ' + filepath;
}
throw err;
}
return changes;
}
// This does not write to the changelog file;
// writeToChangelog must be called afterwards
function addVersionJSONToChangelog(version, changes) {
if (!changelog) readChangelog();
var entry = '\n\n## ' + version;
changes.forEach(function(change) {
entry += '\n* ' + change.type + ': ' + change.category + ': ' +
change.description;
});
var logParts = changelog.split(insertMarker);
logParts[0] = logParts[0]
.replace(versionMarkerReg, versionMarker.join(version)) + insertMarker;
changelog = logParts.join(entry);
}
// This does not write to the changelog file;
// writeToChangelog must be called afterwards
function readVersionJSONAndAddToChangelog(version) {
var changes = readChangesFromJSON(changesDir + version + '.json');
addVersionJSONToChangelog(version, changes);
}
function writeToChangelog() {
if (!changelog) throw new Error('Nothing to write');
fs.writeFileSync(changelogFile, changelog);
console.log('Successfully updated CHANGELOG');
}
function writeToVersionJSON(version, json) {
var content = JSON.stringify(json, null, 4);
fs.writeFileSync(changesDir + version + '.json', content);
console.log('Successfully added ' + version + '.json to ' + changesDir);
}
function clearNextReleaseDir() {
if (!nextReleaseFiles) listNextReleaseFiles();
nextReleaseFiles.forEach(function(filepath) {
fsSyncFromRoot('unlink', filepath);
});
console.log(nextReleaseDir + ' has been cleared');
}
module.exports = {
readChangelog: readChangelog,
getLatestVersion: getLatestVersion,
checkAndNormalizeVersion: checkAndNormalizeVersion,
bumpMajor: bumpMajor,
bumpMinor: bumpMinor,
bumpPatch: bumpPatch,
listVersions: listVersions,
listNextReleaseFiles: listNextReleaseFiles,
startNewChangelog: startNewChangelog,
readChangesFromJSON: readChangesFromJSON,
addVersionJSONToChangelog: addVersionJSONToChangelog,
readVersionJSONAndAddToChangelog: readVersionJSONAndAddToChangelog,
writeToChangelog: writeToChangelog,
writeToVersionJSON: writeToVersionJSON,
clearNextReleaseDir: clearNextReleaseDir
};


@@ -0,0 +1,37 @@
const {execute, executeLongProcess} = require('./lib/test-helper');
async function run() {
const EXEC = {
'execute': execute,
'executeLongProcess': executeLongProcess,
}
const scripts = [
{ execute: 'executeLongProcess', command: ['npm', 'run', 'helper-test'], retryCount: 1 },
{ execute: 'executeLongProcess', command: ['npm', 'run', 'lint']},
{ execute: 'executeLongProcess', command: ['npm', 'run', 'coverage'] },
{ execute: 'executeLongProcess', command: ['npm', 'run', 'buildertest'] },
{ execute: 'executeLongProcess', command: ['npm', 'run', 'tstest'] },
{ execute: 'executeLongProcess', command: ['npm', 'run', 'region-check'] },
{ execute: 'executeLongProcess', command: ['npm', 'run', 'translate-api-test'] },
{ execute: 'executeLongProcess', command: ['npm', 'run', 'typings-generator-test'] },
{ execute: 'executeLongProcess', command: ['npm', 'run', 'browsertest'] },
{ execute: 'executeLongProcess', command: ['npm', 'run', 'react-native-test'] },
{ execute: 'executeLongProcess', command: ['npm', 'run', 'csm-functional-test'] }
];
for (const { execute, command, execOptions, retryCount } of scripts) {
await EXEC[execute](command, execOptions, retryCount);
}
}
(async () => {
try {
await run();
} catch (e) {
console.log(e);
process.exit(1);
}
})();


@@ -0,0 +1,119 @@
#!/usr/bin/env node
var repl = require('repl').start('aws-sdk> '),
replEval = repl.eval,
defaultOptions = {
logger: process.stdout,
region: process.env.AWS_REGION || 'us-east-1'
};
function customEval(cmd, context, filename, callback) {
replEval(cmd, context, filename, function(err, value) {
if (err) {
callback(err, null);
return;
}
function consoleDataExtraction(resp) {
context.data = resp.data;
context.error = resp.error;
callback(resp.error, resp.data);
}
if (value && value.constructor === AWS.Request && !value.__hasBeenEval__) {
try {
value.__hasBeenEval__ = true;
if (value.response) value.response.__hasBeenEval__ = true;
context.request = value;
context.response = value.response || null;
context.data = null;
context.error = null;
if (value._asm.currentState === 'complete' && value.response) {
context.data = value.response.data || null;
context.error = value.response.error || null;
callback(value.response.error, value.response.data);
} else {
value.on('complete', consoleDataExtraction);
if (!value.__hasBeenSent__) {
if (context.autoSend) {
value.send();
} else {
callback(null, value);
}
}
}
} catch (err2) {
callback(err2, null);
return;
}
} else if (value && value.constructor === AWS.Response && !value.__hasBeenEval__) {
try {
value.__hasBeenEval__ = true;
context.response = value;
context.request = value.request || null;
context.data = value.data || null;
context.error = value.error || null;
if (value.request) {
value.request.__hasBeenEval__ = true;
if (value.request._asm.currentState === 'complete') {
callback(value.error, value.data);
} else {
value.request.on('complete', consoleDataExtraction);
}
}
} catch (err2) {
callback(err2, null);
return;
}
} else {
callback(null, value);
}
});
}
var AWS = repl.context.AWS = require('../lib/aws');
repl.eval = customEval;
// context variables
repl.context.data = null;
repl.context.error = null;
repl.context.request = null;
repl.context.response = null;
// setup REPL history
try {
var replHistory = require('repl.history');
replHistory(repl, process.env.HOME + '/.node_history');
} catch (e) {
console.log("Missing repl.history package, history will not be supported.");
console.log(" Type `npm install repl.history` to enable history.");
}
// modify Request.prototype.send listener to track if the listener has been called
var sendListener = AWS.Request.prototype.send;
AWS.Request.prototype.send = function(callback) {
this.__hasBeenSent__ = true;
return sendListener.call(this, callback);
};
// flag to indicate that requests should be sent when callback is not provided
// by default this is on, but can be turned off by setting `autoSend = false`
repl.context.autoSend = true;
// load services as defined instances
for (var key in AWS) {
var id = AWS[key].serviceIdentifier;
if (id) {
if (id === 'cloudsearchdomain' || id === 'iotdata') continue; // these require an explicit endpoint
var svcClass = AWS[key];
var svc = new svcClass(defaultOptions);
svc.with = function(config) {
return new this.constructor.__super__(AWS.util.merge(this.config, config));
};
repl.context[svcClass.serviceIdentifier] = svc;
}
}


@@ -0,0 +1,179 @@
{
"version": "2.0",
"metadata": {
"apiVersion": "2018-03-30",
"endpointPrefix": "foo",
"protocol": "rest-json",
"serviceId": "Foo",
"uid": "foo-2018-03-30"
},
"operations": {
"FancyOperation": {
"name" :"FancyOperation",
"http": {
"method": "GET",
"requestUri": "/"
},
"input": {
"shape": "FancyStructure"
}
},
"BarOperation": {
"name": "BarOperation",
"http": {
"method": "GET",
"requestUri": "/"
},
"input": {
"shape": "String"
},
"output": {
"shape": "String"
}
},
"EventStreamOnInputOperation": {
"name": "EventStreamOnInputOperation",
"http": {
"method": "GET",
"requestUri": "/"
},
"input": {
"shape": "String"
}
},
"EventStreamOnInputPayloadOperation": {
"name": "EventStreamOnInputPayloadOperation",
"http": {
"method": "GET",
"requestUri": "/"
},
"input": {
"shape": "String"
}
},
"EventStreamOnOutputOperation": {
"name": "EventStreamOnOutputOperation",
"http": {
"method": "GET",
"requestUri": "/"
},
"output": {
"shape": "String"
}
},
"EventStreamOnOutputPayloadOperation": {
"name": "EventStreamOnOutputPayloadOperation",
"http": {
"method": "GET",
"requestUri": "/"
},
"output": {
"shape": "String"
}
},
"BazOperation": {
"name": "BazOperation",
"http": {
"method": "GET",
"requestUri": "/"
},
"input": {
"shape": "String"
}
}
},
"shapes": {
"FancyStructure": {
"type": "structure",
"required": [],
"members": {
"Map": {
"shape": "MapOfList"
}
}
},
"ListOfString": {
"type": "list",
"member": {
"shape": "String"
}
},
"ListOfList": {
"type": "list",
"member": {
"shape": "ListOfString"
}
},
"MapOfString": {
"type": "map",
"key": {
"shape": "String"
},
"value": {
"shape": "String"
}
},
"MapOfList": {
"type": "map",
"key": {
"shape": "String"
},
"value": {
"shape": "ListOfString"
}
},
"String": {
"type": "string"
},
"BarOperationInput": {
"type": "structure",
"members": {
"String": {
"shape": "StringShape"
}
}
},
"BarOperationOutput": {
"type": "structure",
"members": {
"String": {
"shape": "StringShape"
}
}
},
"BazOperationInput": {
"type": "structure",
"members": {
"BazString": {
"shape": "BazStringShape",
"timestampFormat": "iso8601"
}
}
},
"EventStreamPayload": {
"type": "structure",
"members": {
"Payload": {
"shape": "EventStreamStructure"
}
},
"payload": "Payload"
},
"EventStreamStructure": {
"type": "structure",
"members": {
"String": {
"shape": "StringShape"
}
},
"eventstream": true
},
"BazStringShape": {
"type": "timestamp",
"timestampFormat": "rfc822"
},
"StringShape": {
"type": "string"
}
}
}


@@ -0,0 +1,127 @@
{
"version": "2.0",
"metadata": {
"apiVersion": "2018-03-30",
"endpointPrefix": "foo",
"protocol": "rest-json",
"serviceId": "Foo",
"uid": "foo-2018-03-30"
},
"operations": {
"BarOperation": {
"name": "BarOperation",
"http": {
"method": "GET",
"requestUri": "/"
},
"input": {
"shape": "BarOperationInput"
},
"output": {
"shape": "BarOperationOutput"
}
},
"EventStreamOnInputOperation": {
"name": "EventStreamOnInputOperation",
"http": {
"method": "GET",
"requestUri": "/"
},
"input": {
"shape": "EventStreamStructure"
}
},
"EventStreamOnInputPayloadOperation": {
"name": "EventStreamOnInputPayloadOperation",
"http": {
"method": "GET",
"requestUri": "/"
},
"input": {
"shape": "EventStreamPayload"
}
},
"EventStreamOnOutputOperation": {
"name": "EventStreamOnOutputOperation",
"http": {
"method": "GET",
"requestUri": "/"
},
"output": {
"shape": "EventStreamStructure"
}
},
"EventStreamOnOutputPayloadOperation": {
"name": "EventStreamOnOutputPayloadOperation",
"http": {
"method": "GET",
"requestUri": "/"
},
"output": {
"shape": "EventStreamPayload"
}
},
"BazOperation": {
"name": "BazOperation",
"http": {
"method": "GET",
"requestUri": "/"
},
"input": {
"shape": "BazOperationInput"
}
}
},
"shapes": {
"BarOperationInput": {
"type": "structure",
"members": {
"String": {
"shape": "StringShape"
}
}
},
"BarOperationOutput": {
"type": "structure",
"members": {
"String": {
"shape": "StringShape"
}
}
},
"BazOperationInput": {
"type": "structure",
"members": {
"BazString": {
"shape": "BazStringShape",
"timestampFormat": "iso8601"
}
}
},
"EventStreamPayload": {
"type": "structure",
"members": {
"Payload": {
"shape": "EventStreamStructure"
}
},
"payload": "Payload"
},
"EventStreamStructure": {
"type": "structure",
"members": {
"String": {
"shape": "StringShape"
}
},
"eventstream": true
},
"BazStringShape": {
"type": "timestamp",
"timestampFormat": "rfc822"
},
"StringShape": {
"type": "string"
}
}
}


@@ -0,0 +1,20 @@
function getOperationShapeNames(model) {
var operationShapeNames = [];
var operations = model.operations;
for (var operationName of Object.keys(operations)) {
var operation = operations[operationName];
if (operation.input && operation.input.shape) {
operationShapeNames.push(operation.input.shape);
}
if (operation.output && operation.output.shape) {
operationShapeNames.push(operation.output.shape);
}
}
return operationShapeNames;
};
module.exports = {
getOperationShapeNames: getOperationShapeNames
};


@@ -0,0 +1,27 @@
var getOperationShapeNames = require('./get-operation-shape-names').getOperationShapeNames;
var visitRelatedShapeNames = require('./visit-related-shape-names').visitRelatedShapeNames;
function pruneShapes(model) {
// start by grabbing the input/output shapes on all operations
var operationShapeNames = getOperationShapeNames(model);
var shapeMap = model.shapes;
for (var operationShape of operationShapeNames) {
// traverse the tree and store visited shapes
visitRelatedShapeNames(operationShape, shapeMap);
}
// iterate over the shapeMap and remove any shape that wasn't visited
var shapeNames = Object.keys(shapeMap);
for (var name of shapeNames) {
if (!shapeMap[name].visited) {
delete shapeMap[name];
}
}
};
module.exports = {
pruneShapes: pruneShapes
};


@@ -0,0 +1,54 @@
/**
* Removes operations from the model if they require event streams on input.
* Event streams on output are supported and left in place.
* @param {Object} model - JSON parsed API model (*.normal.json)
*/
function removeEventStreamOperations(model) {
var modifiedModel = false;
// loop over all operations
var operations = model.operations;
var operationNames = Object.keys(operations);
for (var i = 0; i < operationNames.length; i++) {
var operationName = operationNames[i];
var operation = operations[operationName];
// check the input shape (event streams on output are allowed)
var inputShapeName = operation.input && operation.input.shape;
var requiresEventStream = false;
if (inputShapeName && hasEventStream(model.shapes[inputShapeName], model)) {
requiresEventStream = true;
}
if (requiresEventStream) {
modifiedModel = true;
// remove the operation from the model
console.log('Removing ' + operationName + ' because it depends on event streams on input.');
delete model.operations[operationName];
}
}
return modifiedModel;
}
function hasEventStream(shape, model) {
if (shape.eventstream) {
return true;
}
// check each member shape (one level is sufficient for operation inputs)
var memberNames = Object.keys(shape.members || {});
for (var i = 0; i < memberNames.length; i++) {
var member = shape.members[memberNames[i]];
if (member.eventstream) {
return true;
}
var memberShape = model.shapes[member.shape];
if (memberShape && memberShape.eventstream) {
return true;
}
}
return false;
}
module.exports = {
removeEventStreamOperations: removeEventStreamOperations
};


@@ -0,0 +1,85 @@
const { exec, spawn } = require('child_process');
/**
* wrap child_process.exec with retry. Throws when the command exits with a
* non-zero code; used for trivial commands since error detail won't be printed.
* @param {string[]} command
* @param {object} execOptions
* @param {number} retryCount
*/
async function executeCommand(command, execOptions = {}, retryCount = 1) {
try {
const execCommand = command.join(' ');
const { stderr, stdout } = await execute(execCommand, execOptions);
if (stderr) process.stderr.write(stderr.toString());
if (stdout) process.stdout.write(stdout.toString());
} catch (error) {
if (retryCount > 0) {
await executeCommand(command, execOptions, --retryCount);
} else {
throw error;
}
}
}
function execute(command, options) {
return new Promise((resolve, reject) => {
exec(command, options, (err, stdout, stderr) => {
if (err) {
reject(err);
} else {
resolve({
stdout: stdout,
stderr: stderr
});
}
});
})
}
/**
* wrap child_process.spawn with retry
* @param {string[]} command
* @param {object} execOptions
* @param {number} retryCount
*/
async function executeLongProcessCommand(command, execOptions = {}, retryCount = 1) {
try {
const firstCommand = command[0];
const options = command.slice(1);
await promisifiedSpawn(firstCommand, options, execOptions);
} catch (error) {
if (retryCount > 0) {
await executeLongProcessCommand(command, execOptions, --retryCount);
} else {
throw error;
}
}
}
function promisifiedSpawn(command, options, execOptions) {
return new Promise((resolve, reject) => {
const subProcess = spawn(command, options, execOptions);
subProcess.stdout.on('data', (data) => {
process.stdout.write(data.toString());
});
subProcess.stderr.on('data', (data) => {
process.stderr.write(data.toString());
});
subProcess.on('error', (err) => {
console.error('spawn error: ', err);
});
subProcess.on('close', (code) => {
if (code === 0) {
resolve();
} else {
reject(`"${command} ${options.join(' ')}" exited with code: ${code}`);
}
});
});
}
module.exports = {
execute: executeCommand,
executeLongProcess: executeLongProcessCommand,
}


@@ -0,0 +1,160 @@
/* A couple of utility methods */
function each(obj, iter) {
for (var key in obj) {
if (obj.hasOwnProperty(key)) iter(key, obj[key]);
}
}
function nextString(str) {
return 'S' + (parseInt(str.substr(1), 36) + 1).toString(36);
}
/* End utility methods */
function Translator(api, options) {
var origLength = JSON.stringify(api, null, 2).length;
var debugInfo = {flattened: {}, pruned: {}};
var shapeName = 'S0';
var shapeNameMap = {};
var visitedShapes = {};
function logResults() {
console.log('** Generated', api.metadata.endpointPrefix + '-' +
api.metadata.apiVersion +'.min.json' +
(process.env.DEBUG ? ':' : ''));
if (process.env.DEBUG) {
var pruned = Object.keys(debugInfo.pruned);
var flattened = Object.keys(debugInfo.flattened);
var newLength = JSON.stringify(api, null, 2).length;
console.log('- Pruned Shapes:', pruned.length);
console.log('- Flattened Shapes:', flattened.length);
console.log('- Remaining Shapes:', Object.keys(api.shapes).length);
console.log('- Original Size:', origLength / 1024.0, 'kb');
console.log('- Minified Size:', newLength / 1024.0, 'kb');
console.log('- Size Saving:', (origLength - newLength) / 1024.0, 'kb');
console.log('');
}
}
function deleteTraits(obj) {
if (!options.documentation) {
delete obj.documentation;
delete obj.documentationUrl;
delete obj.errors;
delete obj.min;
delete obj.max;
delete obj.pattern;
delete obj['enum'];
delete obj.box;
}
}
function trackShapeDeclaration(ref) {
if (ref.shape && !shapeNameMap[ref.shape]) {
// found a shape declaration we have not yet visited.
// assign a new generated name in the shapeNameMap & visit it
var oldShapeName = ref.shape;
ref.shape = shapeName = nextString(shapeName);
visitedShapes[shapeName] = api.shapes[oldShapeName];
shapeNameMap[oldShapeName] = {name: shapeName, refs: [ref]};
traverseShapeRef(api.shapes[oldShapeName]);
} else if (ref.shape && shapeNameMap[ref.shape]) {
// we visited this shape before. keep track of this ref and rename
// the referenced declaration to the generated name
var map = shapeNameMap[ref.shape];
map.refs.push(ref);
ref.shape = map.name;
}
}
function pruneShapes() {
// prune shapes visited only once or only have type specifiers
each(shapeNameMap, function(name, map) {
if (Object.keys(visitedShapes[map.name]).join() === 'type' &&
['structure','map','list'].indexOf(visitedShapes[map.name].type) < 0) {
// flatten out the shape (only a scalar type property is on the shape)
for (var i = 0; i < map.refs.length; i++) {
var ref = map.refs[i];
debugInfo.flattened[name] = true;
delete ref.shape;
ref.type = visitedShapes[map.name].type;
// string type is default, don't need to specify this
if (ref.type === 'string') delete ref.type;
}
// we flattened all refs, we can prune the shape too
delete visitedShapes[map.name];
debugInfo.pruned[name] = true;
} else if (map.refs.length === 1) { // only visited once
// merge shape data onto ref
var shape = visitedShapes[map.name];
for (var i = 0; i < map.refs.length; i++) {
delete map.refs[i].shape;
for (var prop in shape) {
if (shape.hasOwnProperty(prop)) {
//Translator prefers timestampFormat trait in members rather than in shape
if (map.refs[i].hasOwnProperty(prop) && ['timestampFormat'].indexOf(prop) >= 0) {
continue;
}
map.refs[i][prop] = shape[prop];
}
}
}
// delete the visited shape
delete visitedShapes[map.name];
debugInfo.pruned[name] = true;
}
});
}
function traverseShapeRef(ref) {
if (!ref) return;
deleteTraits(ref);
traverseShapeRef(ref.key); // for maps
traverseShapeRef(ref.value); // for maps
traverseShapeRef(ref.member); // for lists
// for structures
each(ref.members || {}, function(key, value) { traverseShapeRef(value); });
// resolve shape declarations
trackShapeDeclaration(ref);
}
function traverseOperation(op) {
deleteTraits(op);
delete op.name;
if (op.http) {
if (op.http.method === 'POST') delete op.http.method;
if (op.http.requestUri === '/') delete op.http.requestUri;
if (Object.keys(op.http).length === 0) delete op.http;
}
traverseShapeRef(op.input);
traverseShapeRef(op.output);
}
function traverseApi() {
deleteTraits(api);
each(api.operations, function(name, op) { traverseOperation(op); });
api.shapes = visitedShapes;
}
traverseApi();
pruneShapes();
logResults();
return api;
}
module.exports = Translator;


@@ -0,0 +1,75 @@
{
"cloudfront": [
{
"path": "lib/cloudfront/signer",
"imports": [
{
"name": "Signer",
"alias": "signer"
}
]
}
],
"dynamodb": [
{
"path": "lib/dynamodb/document_client",
"imports": [
{
"name": "DocumentClient",
"alias": "document_client"
}
]
},
{
"path": "lib/dynamodb/converter",
"imports": [
{
"name": "Converter",
"alias": "converter"
}
]
}
],
"polly": [
{
"path": "lib/polly/presigner",
"imports": [
{
"name": "Presigner",
"alias": "presigner"
}
]
}
],
"rds": [
{
"path": "lib/rds/signer",
"imports": [
{
"name": "Signer",
"alias": "signer"
}
]
}
],
"s3": [
{
"path": "lib/s3/managed_upload",
"imports": [
{
"name": "ManagedUpload",
"alias": "managed_upload"
}
]
},
{
"path": "lib/s3/presigned_post",
"imports": [
{
"name": "PresignedPost",
"alias": "presigned_post"
}
]
}
]
}


@@ -0,0 +1,686 @@
var fs = require('fs');
var path = require('path');
var pruneShapes = require('./prune-shapes').pruneShapes;
var CUSTOM_CONFIG_ENUMS = {
DUALSTACK: {
FILE_NAME: 'config_use_dualstack',
INTERFACE: 'UseDualstackConfigOptions'
}
};
function TSGenerator(options) {
this._sdkRootDir = options.SdkRootDirectory || process.cwd();
this._apiRootDir = path.join(this._sdkRootDir, 'apis');
this._metadataPath = path.join(this._apiRootDir, 'metadata.json');
this._clientsDir = path.join(this._sdkRootDir, 'clients');
// Lazy loading values on usage to avoid side-effects in constructor
this.metadata = null;
this.typings = {};
this.streamTypes = {};
}
/**
* Loads the AWS SDK metadata.json file.
*/
TSGenerator.prototype.loadMetadata = function loadMetadata() {
var metadataFile = fs.readFileSync(this._metadataPath);
this.metadata = JSON.parse(metadataFile);
return this.metadata;
};
/**
* Modifies metadata to include api model filenames.
*/
TSGenerator.prototype.fillApiModelFileNames = function fillApiModelFileNames() {
var modelPaths = fs.readdirSync(this._apiRootDir);
if (!this.metadata) {
this.loadMetadata();
}
var metadata = this.metadata;
// sort paths so latest versions appear first
modelPaths = modelPaths.sort(function sort(a, b) {
if (a < b) {
return 1;
} else if (a > b) {
return -1;
} else {
return 0;
}
});
// Only get latest version of models
var foundModels = Object.create(null);
modelPaths.forEach(function(modelFileName) {
var match = modelFileName.match(/^(.+)(-[\d]{4}-[\d]{2}-[\d]{2})\.normal\.json$/i);
if (match) {
var model = match[1];
// add version
var version = match[2].substring(1);
if (!foundModels[model]) {
foundModels[model] = {
latestFileName: modelFileName,
versions: [version]
};
} else {
foundModels[model].versions.push(version);
}
}
});
// now update the metadata
var keys = Object.keys(metadata);
keys.forEach(function(key) {
var modelName = metadata[key].prefix || key;
var modelInfo = foundModels[modelName];
metadata[key].api_path = modelInfo.latestFileName;
// find waiters file
var baseName = modelInfo.latestFileName.split('.')[0];
if (modelPaths.indexOf(baseName + '.waiters2.json') >= 0) {
metadata[key].waiters_path = baseName + '.waiters2.json';
}
// add versions
if (!metadata[key].versions) {
metadata[key].versions = [];
}
metadata[key].versions = [].concat(metadata[key].versions, modelInfo.versions);
});
};
TSGenerator.prototype.updateDynamoDBDocumentClient = function updateDynamoDBDocumentClient() {
// read in document client customization
var docClientCustomCode = fs.readFileSync(path.join(this._sdkRootDir, 'lib', 'dynamodb', 'document_client.d.ts')).toString();
var lines = docClientCustomCode.split('\n');
var namespaceIndexStart = -1;
var namespaceIndexEnd = -1;
for (var i = 0, iLen = lines.length; i < iLen; i++) {
var line = lines[i];
// find exported namespace
if (line.indexOf('//<!--auto-generated start-->') >= 0) {
namespaceIndexStart = i;
}
if (line.indexOf('//<!--auto-generated end-->') >= 0) {
namespaceIndexEnd = i;
break;
}
}
if (namespaceIndexStart >= 0 && namespaceIndexEnd >= 0) {
// insert doc client interfaces
lines.splice(namespaceIndexStart + 1, (namespaceIndexEnd - namespaceIndexStart - 1), this.generateDocumentClientInterfaces(1));
var code = lines.join('\n');
this.writeTypingsFile('document_client', path.join(this._sdkRootDir, 'lib', 'dynamodb'), code);
}
};
/**
* Generates the file containing DocumentClient interfaces.
*/
TSGenerator.prototype.generateDocumentClientInterfaces = function generateDocumentClientInterfaces(tabCount) {
tabCount = tabCount || 0;
var self = this;
// get the dynamodb model
var dynamodbModel = this.loadServiceApi('dynamodb');
var code = '';
// stub Blob interface
code += this.tabs(tabCount) + 'interface Blob {}\n';
// generate shapes
var modelShapes = dynamodbModel.shapes;
// iterate over each shape
var shapeKeys = Object.keys(modelShapes);
shapeKeys.forEach(function (shapeKey) {
var modelShape = modelShapes[shapeKey];
// ignore exceptions
if (modelShape.exception) {
return;
}
// overwrite AttributeValue
if (shapeKey === 'AttributeValue') {
code += self.generateDocString('A JavaScript object or native type.', tabCount);
code += self.tabs(tabCount) + 'export type ' + shapeKey + ' = any;\n';
return;
}
code += self.generateTypingsFromShape(dynamodbModel, shapeKey, modelShape, tabCount, []);
});
return code;
};
/**
* Returns a service model based on the serviceIdentifier.
*/
TSGenerator.prototype.loadServiceApi = function loadServiceApi(serviceIdentifier) {
// first, find the correct identifier
var metadata = this.metadata;
var serviceFilePath = path.join(this._apiRootDir, metadata[serviceIdentifier].api_path);
var serviceModelFile = fs.readFileSync(serviceFilePath);
var serviceModel = JSON.parse(serviceModelFile);
// load waiters file if it exists
var waiterFilePath;
if (metadata[serviceIdentifier].waiters_path) {
waiterFilePath = path.join(this._apiRootDir, metadata[serviceIdentifier].waiters_path);
var waiterModelFile = fs.readFileSync(waiterFilePath);
var waiterModel = JSON.parse(waiterModelFile);
serviceModel.waiters = waiterModel.waiters;
}
return serviceModel;
};
/**
* Determines if a member is required by checking for it in a list.
*/
TSGenerator.prototype.checkRequired = function checkRequired(list, member) {
if (list.indexOf(member) >= 0) {
return true;
}
return false;
};
/**
* Generates whitespace based on the count.
*/
TSGenerator.prototype.tabs = function tabs(count) {
var code = '';
for (var i = 0; i < count; i++) {
code += ' ';
}
return code;
};
/**
* Transforms documentation string to a more readable format.
*/
TSGenerator.prototype.transformDocumentation = function transformDocumentation(documentation) {
if (!documentation) {
return '';
}
documentation = documentation.replace(/<(?:.|\n)*?>/gm, '');
documentation = documentation.replace(/\*\//g, '*');
return documentation;
};
/**
* Returns a doc string based on the supplied documentation.
* Also tabs the doc string if a count is provided.
*/
TSGenerator.prototype.generateDocString = function generateDocString(documentation, tabCount) {
tabCount = tabCount || 0;
var code = '';
code += this.tabs(tabCount) + '/**\n';
code += this.tabs(tabCount) + ' * ' + this.transformDocumentation(documentation) + '\n';
code += this.tabs(tabCount) + ' */\n';
return code;
};
/**
* Returns an array of custom configuration options based on a service identifier.
* Custom configuration options are determined by checking the metadata.json file.
*/
TSGenerator.prototype.generateCustomConfigFromMetadata = function generateCustomConfigFromMetadata(serviceIdentifier) {
// some services have additional configuration options that are defined in the metadata.json file
// i.e. dualstackAvailable = useDualstack
// create reference to custom options
var customConfigurations = [];
var serviceMetadata = this.metadata[serviceIdentifier];
// loop through metadata members
for (var memberName in serviceMetadata) {
if (!serviceMetadata.hasOwnProperty(memberName)) {
continue;
}
// check configs
switch (memberName) {
case 'dualstackAvailable':
customConfigurations.push(CUSTOM_CONFIG_ENUMS.DUALSTACK);
break;
}
}
return customConfigurations;
};
TSGenerator.prototype.generateSafeShapeName = function generateSafeShapeName(name, blacklist) {
blacklist = blacklist || [];
if (blacklist.indexOf(name) >= 0) {
return '_' + name;
}
return name;
};
TSGenerator.prototype.extractTypesDependOnStream = function extractTypesDependOnStream(shapeKey, modelShape) {
var streamTypeList = [];
if (typeof modelShape !== "object" || Object.keys(modelShape).length === 0) {
return [];
}
if (modelShape.streaming) {
streamTypeList.push(shapeKey);
return streamTypeList;
}
for (var subModelKey in modelShape) {
var subModel = modelShape[subModelKey];
var subStreamTypeList = this.extractTypesDependOnStream(subModelKey, subModel);
if (Object.keys(subStreamTypeList).length !== 0) {
for (var streamType of subStreamTypeList) {
streamTypeList.push(streamType);
}
}
}
return streamTypeList;
}
TSGenerator.prototype.addReadableType = function addReadableType(shapeKey) {
var code = '';
if (this.streamTypes[shapeKey]) {
code += '|Readable';
} else if (shapeKey[0] === '_' && this.streamTypes[shapeKey.slice(1)]) {
code += '|Readable';
}
return code;
}
/**
* Generates a type or interface based on the shape.
*/
TSGenerator.prototype.generateTypingsFromShape = function generateTypingsFromShape(model, shapeKey, shape, tabCount, customClassNames) {
// some shapes shouldn't be generated if they are javascript primitives
var jsPrimitives = ['string', 'boolean', 'number'];
if (jsPrimitives.indexOf(shapeKey) >= 0) {
return '';
}
if (['Date', 'Blob'].indexOf(shapeKey) >= 0) {
shapeKey = '_' + shapeKey;
}
// In at least one (cloudfront.Signer) case, a class on a service namespace clashes with a shape
shapeKey = this.generateSafeShapeName(shapeKey, customClassNames);
var self = this;
var code = '';
tabCount = tabCount || 0;
var tabs = this.tabs;
var type = shape.type;
if (shape.eventstream) {
// eventstreams MUST be structures
var members = Object.keys(shape.members);
var events = members.map(function(member) {
// each member is an individual event type, so each must be optional
return member + '?:' + shape.members[member].shape;
});
return code += tabs(tabCount) + 'export type ' + shapeKey + ' = EventStream<{' + events.join(',') + '}>;\n';
}
if (type === 'structure') {
if (shape.isDocument) {
return code += tabs(tabCount) + 'export type ' + shapeKey + ' = DocumentType;\n'
}
code += tabs(tabCount) + 'export interface ' + shapeKey + ' {\n';
var members = shape.members;
// cycle through members
var memberKeys = Object.keys(members);
memberKeys.forEach(function(memberKey) {
// docs
var member = members[memberKey];
if (member.documentation) {
code += self.generateDocString(member.documentation, tabCount + 1);
}
var required = self.checkRequired(shape.required || [], memberKey) ? '' : '?';
var memberType = member.shape;
if (member.eventpayload) {
// eventpayloads are always either structures, or buffers
if (['blob', 'binary'].indexOf(model.shapes[memberType].type) >= 0) {
memberType = 'Buffer';
}
}
memberType = self.generateSafeShapeName(memberType, [].concat(customClassNames, ['Date', 'Blob']));
code += tabs(tabCount + 1) + memberKey + required + ': ' + memberType + ';\n';
});
code += tabs(tabCount) + '}\n';
} else if (type === 'list') {
code += tabs(tabCount) + 'export type ' + shapeKey + ' = ' + this.generateSafeShapeName(shape.member.shape, customClassNames) + '[];\n';
} else if (type === 'map') {
code += tabs(tabCount) + 'export type ' + shapeKey + ' = {[key: string]: ' + this.generateSafeShapeName(shape.value.shape, customClassNames) + '};\n';
} else if (type === 'string' || type === 'character') {
var stringType = 'string';
if (Array.isArray(shape.enum)) {
stringType = shape.enum.map(function(s) {
return '"' + s + '"';
}).join('|') + '|' + stringType;
}
code += tabs(tabCount) + 'export type ' + shapeKey + ' = ' + stringType + ';\n';
} else if (['double', 'long', 'short', 'biginteger', 'bigdecimal', 'integer', 'float'].indexOf(type) >= 0) {
code += tabs(tabCount) + 'export type ' + shapeKey + ' = number;\n';
} else if (type === 'timestamp') {
code += tabs(tabCount) + 'export type ' + shapeKey + ' = Date;\n';
} else if (type === 'boolean') {
code += tabs(tabCount) + 'export type ' + shapeKey + ' = boolean;\n';
} else if (type === 'blob' || type === 'binary') {
code += tabs(tabCount) + 'export type ' + shapeKey + ' = Buffer|Uint8Array|Blob|string'
+ self.addReadableType(shapeKey)
+';\n';
}
return code;
};
/**
* Generates a class method type for an operation.
*/
TSGenerator.prototype.generateTypingsFromOperations = function generateTypingsFromOperations(className, operation, operationName, tabCount) {
var code = '';
tabCount = tabCount || 0;
var tabs = this.tabs;
var input = operation.input;
var output = operation.output;
operationName = operationName.charAt(0).toLowerCase() + operationName.substring(1);
var inputShape = input ? className + '.Types.' + input.shape : '{}';
var outputShape = output ? className + '.Types.' + output.shape : '{}';
if (input) {
code += this.generateDocString(operation.documentation, tabCount);
code += tabs(tabCount) + operationName + '(params: ' + inputShape + ', callback?: (err: AWSError, data: ' + outputShape + ') => void): Request<' + outputShape + ', AWSError>;\n';
}
code += this.generateDocString(operation.documentation, tabCount);
code += tabs(tabCount) + operationName + '(callback?: (err: AWSError, data: ' + outputShape + ') => void): Request<' + outputShape + ', AWSError>;\n';
return code;
};
TSGenerator.prototype.generateConfigurationServicePlaceholders = function generateConfigurationServicePlaceholders() {
/**
* Should create a config service placeholder
*/
var self = this;
var metadata = this.metadata;
// Iterate over every service
var serviceIdentifiers = Object.keys(metadata);
var code = '';
var configCode = '';
var versionsCode = '';
code += 'import * as AWS from \'../clients/all\';\n';
configCode += 'export abstract class ConfigurationServicePlaceholders {\n';
versionsCode += 'export interface ConfigurationServiceApiVersions {\n';
serviceIdentifiers.forEach(function(serviceIdentifier) {
var className = self.metadata[serviceIdentifier].name;
configCode += self.tabs(1) + serviceIdentifier + '?: AWS.' + className + '.Types.ClientConfiguration;\n';
versionsCode += self.tabs(1) + serviceIdentifier + '?: AWS.' + className + '.Types.apiVersion;\n';
});
configCode += '}\n';
versionsCode += '}\n';
code += configCode + versionsCode;
this.writeTypingsFile('config_service_placeholders', path.join(this._sdkRootDir, 'lib'), code);
};
TSGenerator.prototype.getServiceApiVersions = function getServiceApiVersions(serviceIdentifier) {
var metadata = this.metadata;
var versions = metadata[serviceIdentifier].versions || [];
// transform results (to get rid of '*') and sort
versions = versions.map(function(version) {
return version.replace('*', '');
}).sort();
return versions;
};
/**
* Generates class method types for a waiter.
*/
TSGenerator.prototype.generateTypingsFromWaiters = function generateTypingsFromWaiters(className, waiterState, waiter, underlyingOperation, tabCount) {
var code = '';
tabCount = tabCount || 0;
var operationName = waiter.operation.charAt(0).toLowerCase() + waiter.operation.substring(1);
waiterState = waiterState.charAt(0).toLowerCase() + waiterState.substring(1);
var docString = 'Waits for the ' + waiterState + ' state by periodically calling the underlying ' + className + '.' + operationName + ' operation every ' + waiter.delay + ' seconds (at most ' + waiter.maxAttempts + ' times).';
if (waiter.description) {
docString += ' ' + waiter.description;
}
// get input and output
var inputShape = '{}';
var outputShape = '{}';
if (underlyingOperation.input) {
inputShape = className + '.Types.' + underlyingOperation.input.shape;
}
if (underlyingOperation.output) {
outputShape = className + '.Types.' + underlyingOperation.output.shape;
}
code += this.generateDocString(docString, tabCount);
code += this.tabs(tabCount) + 'waitFor(state: "' + waiterState + '", params: ' + inputShape + ' & {$waiter?: WaiterConfiguration}, callback?: (err: AWSError, data: ' + outputShape + ') => void): Request<' + outputShape + ', AWSError>;\n';
code += this.generateDocString(docString, tabCount);
code += this.tabs(tabCount) + 'waitFor(state: "' + waiterState + '", callback?: (err: AWSError, data: ' + outputShape + ') => void): Request<' + outputShape + ', AWSError>;\n';
return code;
};
/**
* Returns whether a service has customizations to include.
*/
TSGenerator.prototype.includeCustomService = function includeCustomService(serviceIdentifier) {
// check services directory
var servicesDir = path.join(this._sdkRootDir, 'lib', 'services');
var fileNames = fs.readdirSync(servicesDir);
fileNames = fileNames.filter(function(fileName) {
return fileName === serviceIdentifier + '.d.ts';
});
return !!fileNames.length;
};
/**
* Generates typings for classes that live on a service client namespace.
*/
TSGenerator.prototype.generateCustomNamespaceTypes = function generateCustomNamespaceTypes(serviceIdentifier, className) {
var self = this;
var tsCustomizationsJson = require('./ts-customizations');
var customClasses = [];
var code = '';
var serviceInfo = tsCustomizationsJson[serviceIdentifier] || null;
// exit early if no customizations found
if (!serviceInfo) {
return null;
}
code += 'declare namespace ' + className + ' {\n';
//generate import code
var importCode = '';
serviceInfo.forEach(function(data) {
var aliases = [];
var imports = data.imports || [];
imports.forEach(function(pair) {
aliases.push(pair.name + ' as ' + pair.alias);
code += self.tabs(1) + 'export import ' + pair.name + ' = ' + pair.alias + ';\n';
customClasses.push(pair.name);
});
if (aliases.length) {
importCode += 'import {' + aliases.join(', ') + '} from \'../' + data.path + '\';\n';
}
});
code += '}\n';
return {
importCode: importCode,
namespaceCode: code,
customClassNames: customClasses
};
};
TSGenerator.prototype.containsEventStreams = function containsEventStreams(model) {
var shapeNames = Object.keys(model.shapes);
for (var name of shapeNames) {
if (model.shapes[name].eventstream) {
return true;
}
}
return false;
};
TSGenerator.prototype.containsDocumentType = function containsDocumentType(model) {
var shapeNames = Object.keys(model.shapes);
for (var name of shapeNames) {
if (model.shapes[name].isDocument) {
return true;
}
}
return false;
};
/**
* Generates the typings for a service based on the serviceIdentifier.
*/
TSGenerator.prototype.processServiceModel = function processServiceModel(serviceIdentifier) {
var model = this.loadServiceApi(serviceIdentifier);
pruneShapes(model);
var self = this;
var code = '';
var className = this.metadata[serviceIdentifier].name;
var customNamespaces = this.generateCustomNamespaceTypes(serviceIdentifier, className);
var customClassNames = customNamespaces ? customNamespaces.customClassNames : [];
var waiters = model.waiters || Object.create(null);
var waiterKeys = Object.keys(waiters);
// generate imports
code += 'import {Request} from \'../lib/request\';\n';
code += 'import {Response} from \'../lib/response\';\n';
code += 'import {AWSError} from \'../lib/error\';\n';
var hasCustomizations = this.includeCustomService(serviceIdentifier);
var parentClass = hasCustomizations ? className + 'Customizations' : 'Service';
if (hasCustomizations) {
code += 'import {' + parentClass + '} from \'../lib/services/' + serviceIdentifier + '\';\n';
} else {
code += 'import {' + parentClass + '} from \'../lib/service\';\n';
}
if (waiterKeys.length) {
code += 'import {WaiterConfiguration} from \'../lib/service\';\n';
}
code += 'import {ServiceConfigurationOptions} from \'../lib/service\';\n';
// get any custom config options
var customConfig = this.generateCustomConfigFromMetadata(serviceIdentifier);
var hasCustomConfig = !!customConfig.length;
var customConfigTypes = ['ServiceConfigurationOptions'];
code += 'import {ConfigBase as Config} from \'../lib/config-base\';\n';
if (hasCustomConfig) {
// generate import statements and custom config type
customConfig.forEach(function(config) {
code += 'import {' + config.INTERFACE + '} from \'../lib/' + config.FILE_NAME + '\';\n';
customConfigTypes.push(config.INTERFACE);
});
}
if (this.containsEventStreams(model)) {
code += 'import {EventStream} from \'../lib/event-stream/event-stream\';\n';
}
if (this.containsDocumentType(model)) {
code += 'import {DocumentType} from \'../lib/model\';\n';
}
// import custom namespaces
if (customNamespaces) {
code += customNamespaces.importCode;
}
code += 'interface Blob {}\n';
// generate methods
var modelOperations = model.operations;
var operationKeys = Object.keys(modelOperations);
code += 'declare class ' + className + ' extends ' + parentClass + ' {\n';
// create constructor
code += this.generateDocString('Constructs a service object. This object has one method for each API operation.', 1);
code += this.tabs(1) + 'constructor(options?: ' + className + '.Types.ClientConfiguration' + ')\n';
code += this.tabs(1) + 'config: Config & ' + className + '.Types.ClientConfiguration' + ';\n';
operationKeys.forEach(function (operationKey) {
code += self.generateTypingsFromOperations(className, modelOperations[operationKey], operationKey, 1);
});
// generate waitFor methods
waiterKeys.forEach(function (waitersKey) {
var waiter = waiters[waitersKey];
var operation = modelOperations[waiter.operation];
code += self.generateTypingsFromWaiters(className, waitersKey, waiter, operation, 1);
});
code += '}\n';
// check for static classes on namespace
if (customNamespaces) {
code += customNamespaces.namespaceCode;
}
// shapes should map to interfaces
var modelShapes = model.shapes;
// iterate over each shape
var shapeKeys = Object.keys(modelShapes);
code += 'declare namespace ' + className + ' {\n';
// preprocess shapes to fetch out needed dependency. e.g. "streaming": true
shapeKeys.forEach(function (shapeKey) {
var modelShape = modelShapes[shapeKey];
var streamTypeList = self.extractTypesDependOnStream(shapeKey, modelShape);
for (var streamType of streamTypeList) {
self.streamTypes[streamType] = true;
}
});
shapeKeys.forEach(function (shapeKey) {
var modelShape = modelShapes[shapeKey];
code += self.generateTypingsFromShape(model, shapeKey, modelShape, 1, customClassNames);
});
//add extra dependencies like 'streaming'
if (Object.keys(self.streamTypes).length !== 0) {
var insertPos = code.indexOf('interface Blob {}');
code = code.slice(0, insertPos) + 'import {Readable} from \'stream\';\n' + code.slice(insertPos);
}
this.streamTypes = {};
code += this.generateDocString('A string in YYYY-MM-DD format that represents the latest possible API version that can be used in this service. Specify \'latest\' to use the latest possible version.', 1);
code += this.tabs(1) + 'export type apiVersion = "' + this.getServiceApiVersions(serviceIdentifier).join('"|"') + '"|"latest"|string;\n';
code += this.tabs(1) + 'export interface ClientApiVersions {\n';
code += this.generateDocString('A string in YYYY-MM-DD format that represents the latest possible API version that can be used in this service. Specify \'latest\' to use the latest possible version.', 2);
code += this.tabs(2) + 'apiVersion?: apiVersion;\n';
code += this.tabs(1) + '}\n';
code += this.tabs(1) + 'export type ClientConfiguration = ' + customConfigTypes.join(' & ') + ' & ClientApiVersions;\n';
// export interfaces under Types namespace for backwards-compatibility
code += this.generateDocString('Contains interfaces for use with the ' + className + ' client.', 1);
code += this.tabs(1) + 'export import Types = ' + className + ';\n';
code += '}\n';
code += 'export = ' + className + ';\n';
return code;
};
/**
* Write Typings file to the specified directory.
*/
TSGenerator.prototype.writeTypingsFile = function writeTypingsFile(name, directory, code) {
fs.writeFileSync(path.join(directory, name + '.d.ts'), code);
};
/**
 * Create the TypeScript definition files for every service.
*/
TSGenerator.prototype.generateAllClientTypings = function generateAllClientTypings() {
this.fillApiModelFileNames();
var self = this;
var metadata = this.metadata;
// Iterate over every service
var serviceIdentifiers = Object.keys(metadata);
serviceIdentifiers.forEach(function(serviceIdentifier) {
var code = self.processServiceModel(serviceIdentifier);
self.writeTypingsFile(serviceIdentifier, self._clientsDir, code);
});
};
/**
 * Create the TypeScript definition files for the all and browser_default exports.
*/
TSGenerator.prototype.generateGroupedClients = function generateGroupedClients() {
var metadata = this.metadata;
var allCode = '';
var browserCode = '';
// Iterate over every service
var serviceIdentifiers = Object.keys(metadata);
serviceIdentifiers.forEach(function(serviceIdentifier) {
var className = metadata[serviceIdentifier].name;
var code = 'export import ' + className + ' = require(\'./' + serviceIdentifier + '\');\n';
allCode += code;
if (metadata[serviceIdentifier].cors) {
browserCode += code;
}
});
this.writeTypingsFile('all', this._clientsDir, allCode);
this.writeTypingsFile('browser_default', this._clientsDir, browserCode);
};
module.exports = TSGenerator;


@@ -0,0 +1,40 @@
/**
 * Marks the starting shape and every shape reachable from it as visited.
 * @param {string} startingShape
 * @param {{[key: string]: any}} shapeMap
 */
function visitRelatedShapeNames(startingShape, shapeMap) {
var shape = shapeMap[startingShape];
if (shape.visited) {
// exit early if the shape has been visited
return;
}
shape.visited = true;
if (['structure', 'map', 'list'].indexOf(shape.type) < 0) {
// not a complex shape, so it's a terminal shape
return;
}
if (shape.type === 'structure') {
var members = shape.members;
for (var memberName of Object.keys(members)) {
var memberShapeName = members[memberName].shape;
visitRelatedShapeNames(memberShapeName, shapeMap);
}
} else if (shape.type === 'map') {
var keyShape = shape.key.shape;
var valueShape = shape.value.shape;
visitRelatedShapeNames(keyShape, shapeMap);
visitRelatedShapeNames(valueShape, shapeMap);
} else if (shape.type === 'list') {
var memberShape = shape.member.shape;
visitRelatedShapeNames(memberShape, shapeMap);
}
}
module.exports = {
visitRelatedShapeNames: visitRelatedShapeNames
};
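A minimal, self-contained sketch of what this visitor does: starting from one shape, it marks every transitively referenced shape as visited, so unreferenced shapes can later be pruned. The shape names below are invented for illustration, and the traversal is inlined rather than imported.

```javascript
// Hypothetical miniature shape map; real maps come from the API model JSON.
var shapeMap = {
  GetItemOutput: { type: 'structure', members: { Item: { shape: 'AttributeMap' } } },
  AttributeMap: { type: 'map', key: { shape: 'AttributeName' }, value: { shape: 'AttributeValue' } },
  AttributeName: { type: 'string' },
  AttributeValue: { type: 'string' },
  UnusedShape: { type: 'string' }
};

// Same traversal idea as visitRelatedShapeNames, inlined to be self-contained.
function visit(name, map) {
  var shape = map[name];
  if (shape.visited) return;
  shape.visited = true;
  if (shape.type === 'structure') {
    Object.keys(shape.members).forEach(function (m) {
      visit(shape.members[m].shape, map);
    });
  } else if (shape.type === 'map') {
    visit(shape.key.shape, map);
    visit(shape.value.shape, map);
  } else if (shape.type === 'list') {
    visit(shape.member.shape, map);
  }
}

visit('GetItemOutput', shapeMap);
var reachable = Object.keys(shapeMap).filter(function (k) {
  return shapeMap[k].visited;
});
console.log(reachable); // UnusedShape is never visited
```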


@@ -0,0 +1,62 @@
var allowlist = {
'/config.js': [
24,
25,
85,
86,
207,
261,
262
],
'/credentials/cognito_identity_credentials.js': [
78,
79,
109
],
'/credentials/shared_ini_file_credentials.js': [
4,
],
'/credentials/sso_credentials.js': [
15,
],
'/http.js': [
5
],
'/rds/signer.js': [
43,
44,
97,
99,
110,
112
],
'/region/utils.js': [
10
],
'/request.js': [
318,
319
],
'/services/s3.js': [
87,
88,
260,
262,
275,
281,
641,
643,
762,
773,
774,
775,
780
],
'/token/sso_token_provider.js': [
60
]
};
module.exports = {
allowlist: allowlist
};


@@ -0,0 +1,75 @@
var fs = require('fs');
var path = require('path');
var allowlist = require('./allowlist').allowlist;
function checkFile(location) {
var file = fs.readFileSync(location);
var code = file.toString();
var lines = code.split('\n');
var regionMatches = [];
lines.forEach(function(line, idx) {
var matches = line.match(/(us|eu|ap|sa|ca)-\w+-\d+/g);
if (matches) {
regionMatches.push({
file: location,
line: idx,
code: line
});
}
});
return regionMatches;
}
function recursiveGetFilesIn(directory, extensions) {
var filenames = [];
var keys = fs.readdirSync(directory);
for (var i = 0, iLen = keys.length; i < iLen; i++) {
// check if it is a file
var keyPath = path.join(directory, keys[i]);
var stats = fs.statSync(keyPath);
if (stats.isDirectory()) {
filenames = filenames.concat(
recursiveGetFilesIn(keyPath, extensions)
);
continue;
}
if (extensions.indexOf(path.extname(keyPath)) >= 0) {
            filenames.push(keyPath);
}
}
return filenames;
}
function checkForRegions() {
var libPath = path.join(__dirname, '..', '..', 'lib');
var filePaths = recursiveGetFilesIn(libPath, ['.js']);
var regionMatches = [];
var warnings = [];
filePaths.forEach(function(filePath) {
regionMatches = regionMatches.concat(checkFile(filePath));
});
regionMatches.forEach(function(match) {
var normalizedPath = match.file.substring(libPath.length);
if (allowlist[normalizedPath] && allowlist[normalizedPath].indexOf(match.line) >= 0) {
return;
}
warnings.push('File: ' + normalizedPath + '\tLine ' + match.line + ':\t' + match.code.trim());
});
if (warnings.length) {
console.error('Hard-coded regions detected. This should only be done if absolutely certain!');
warnings.forEach(function(warning) {
console.error(warning);
});
process.exit(1);
}
}
checkForRegions();
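Note that the pattern in checkFile() only covers the listed partition prefixes (us/eu/ap/sa/ca). A quick self-contained check of how it behaves on a flagged line versus a clean one (the example strings are invented):

```javascript
// Same pattern checkFile() uses to flag hard-coded regions.
var regionPattern = /(us|eu|ap|sa|ca)-\w+-\d+/g;

var flagged = "var endpoint = 'dynamodb.us-east-1.amazonaws.com';".match(regionPattern);
var clean = 'var region = config.region;'.match(regionPattern);

console.log(flagged); // [ 'us-east-1' ]
console.log(clean);   // null
```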


@@ -0,0 +1,14 @@
"use strict";
var fs_1 = require("fs");
var path_1 = require("path");
var clients = require('../clients/all');
var metadata = require('../apis/metadata');
var api_loader = require('../lib/api_loader');
fs_1.writeFileSync(path_1.resolve(__dirname, '..', 'SERVICES.md'), Object.keys(clients).reduce(function (serviceTable, clientId) {
var cid = clientId.toLowerCase();
return serviceTable + Object.keys(api_loader.services[cid]).reverse()
.map(function (version) {
var model = api_loader(cid, version);
return model.metadata.serviceFullName + " | AWS." + clientId + " | " + version + " | " + (metadata[cid].cors === true ? ':tada:' : '') + " |";
}).join("\n") + "\n";
}, "The SDK currently supports the following services:\n\n<p class=\"note\"><strong>Note</strong>:\nAlthough all services are supported in the browser version of the SDK,\nnot all of the services are available in the default hosted build (using the\nscript tag provided above). Instructions on how to build a\ncustom version of the SDK with individual services are provided\nin the \"<a href=\"http://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/building-sdk-for-browsers.html\">Building the SDK for Browsers</a>\" section of the SDK Developer Guide.\n</p>\n\nService Name | Class Name | API Version | Allows CORS |\n------------ | ---------- | ----------- | ----------- |\n"));


@@ -0,0 +1,29 @@
import {writeFileSync} from 'fs';
import {resolve} from 'path';
const clients = require('../clients/all');
const metadata = require('../apis/metadata');
const api_loader = require('../lib/api_loader');
writeFileSync(
resolve(__dirname, '..', 'SERVICES.md'),
Object.keys(clients).reduce((serviceTable, clientId): string => {
const cid = clientId.toLowerCase();
return serviceTable + Object.keys(api_loader.services[cid]).reverse()
.map((version: string): string => {
const model = api_loader(cid, version);
return `${model.metadata.serviceFullName} | AWS.${clientId} | ${version} | ${metadata[cid].cors === true ? ':tada:' : ''} |`;
}).join("\n") + "\n";
}, `The SDK currently supports the following services:
<p class="note"><strong>Note</strong>:
Although all services are supported in the browser version of the SDK,
not all of the services are available in the default hosted build (using the
script tag provided above). Instructions on how to build a
custom version of the SDK with individual services are provided
in the "<a href="http://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/building-sdk-for-browsers.html">Building the SDK for Browsers</a>" section of the SDK Developer Guide.
</p>
Service Name | Class Name | API Version | Allows CORS |
------------ | ---------- | ----------- | ----------- |
`));
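The reduce over client ids above builds one markdown table row per API version. A small self-contained sketch of the same row-building pattern, with invented service entries in place of the real metadata and api_loader lookups:

```javascript
// Hypothetical entries; the real script derives these from ../apis/metadata
// and ../lib/api_loader.
var rows = [
  { fullName: 'Amazon DynamoDB', className: 'DynamoDB', version: '2012-08-10', cors: true },
  { fullName: 'AWS Lambda', className: 'Lambda', version: '2015-03-31', cors: false }
];

var header =
  'Service Name | Class Name | API Version | Allows CORS |\n' +
  '------------ | ---------- | ----------- | ----------- |\n';

var table = rows.reduce(function (acc, r) {
  return acc + r.fullName + ' | AWS.' + r.className + ' | ' +
    r.version + ' | ' + (r.cors ? ':tada:' : '') + ' |\n';
}, header);

console.log(table);
```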


@@ -0,0 +1,69 @@
#!/usr/bin/env node
var fs = require('fs');
var Translator = require('./lib/translator');
var removeEventStreamOperations = require('./lib/remove-event-stream-ops').removeEventStreamOperations;
var util = require('util');
var path = require('path');
/*
* Minimizes all .normal.json files by flattening shapes, removing
* documentation and removing unused shapes. The result will be written to
* `.min.json` file.
*
 * The passed parameter is the base path. The directory must include the
 * apis/ folder.
*/
function ApiTranslator(basePath) {
this._apisPath = path.join(basePath, 'apis');
}
/*
 * Minimizes the passed .normal.json file into a .min.json file.
 */
ApiTranslator.prototype.minimizeFile = function minimizeFile(filepath) {
var opath = filepath.replace(/\.normal\.json$/, '.min.json');
var data = JSON.parse(fs.readFileSync(path.join(this._apisPath, filepath)).toString());
var didModify = removeEventStreamOperations(data);
if (didModify) {
// original model modified, replace existing normal.json so docs/ts definitions are accurate
fs.writeFileSync(path.join(this._apisPath, filepath), JSON.stringify(data, null, ' '));
}
var translated = new Translator(data, {documentation: false});
var json = JSON.stringify(translated, null, ' ');
fs.writeFileSync(path.join(this._apisPath, opath), json);
};
/*
 * Minimizes files in the apis path. If the optional modelName is passed, only
 * that model is minimized; otherwise all .normal.json files found are.
 */
ApiTranslator.prototype.translateAll = function translateAll(modelName) {
var paths = fs.readdirSync(this._apisPath);
var self = this;
paths.forEach(function(filepath) {
if (filepath.endsWith('.normal.json')) {
if (!modelName || filepath.startsWith(modelName)) {
self.minimizeFile(filepath);
}
}
});
};
/*
 * If executed as a script, initializes the ApiTranslator and minimizes API files.
 *
 * The optional first parameter specifies which model to minimize. If omitted,
 * all files are selected.
 *
 * The optional second parameter specifies the base path. The directory must
 * include the apis/ folder with .normal.json files. Output is written into the
 * same path. If the parameter is not passed, the repository root is used.
 */
if (require.main === module) {
var modelName = process.argv[2] || '';
var basePath = process.argv[3] || path.join(__dirname, '..');
new ApiTranslator(basePath).translateAll(modelName);
}
module.exports = ApiTranslator;
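The .normal.json to .min.json rename in minimizeFile is a simple anchored suffix replacement; the filename below is invented for illustration:

```javascript
// Anchoring on $ ensures only the trailing '.normal.json' is replaced.
var opath = 'dynamodb-2012-08-10.normal.json'.replace(/\.normal\.json$/, '.min.json');
console.log(opath); // dynamodb-2012-08-10.min.json
```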


@@ -0,0 +1,25 @@
#!/usr/bin/env node
/*
 * Pass an optional path to the target directory as a command-line argument.
 *
 * The directory must include the apis/ folder and service customizations at
 * `lib/services`. Clients will be generated in `clients/`.
 *
 * If the parameter is not passed, the repository root is used.
 */
var path = require('path');
var TSGenerator = require('./lib/ts-generator');
var basePath = process.argv[2] || path.join(__dirname, '..');
var tsGenerator = new TSGenerator({
SdkRootDirectory: basePath
});
tsGenerator.generateAllClientTypings();
tsGenerator.generateGroupedClients();
tsGenerator.updateDynamoDBDocumentClient();
tsGenerator.generateConfigurationServicePlaceholders();
console.log('TypeScript Definitions created.');