refactor: use core supplied mappings

Wieland Schöbl, 2021-09-03 15:17:15 +00:00
committed by Rainer Killinger
parent 614a1b1e9b
commit 43a89ec4f2
22 changed files with 1622 additions and 1762 deletions


@@ -4,8 +4,10 @@ This project is a reference implementation for a StApps backend. It provides an
 perform full text search, sorts and filters. It also delivers the configuration needed by the app. The API is specified
 within the [@openstapps/core](https://gitlab.com/openstapps/core).
-If you want to perform requests, index data or search within JavaScript or TypeScript you should consider using
-[@openstapps/api](https://gitlab.com/openstapps/api)
+If you want to perform requests, index data or search within JavaScript or TypeScript you should consider using our client
+[@openstapps/api](https://gitlab.com/openstapps/api).
+Or generate your own client using the openapi/swagger definitions you can get from the [API documentation](https://openstapps.gitlab.io/backend).
 
 # Usage
 This backend is not a standalone software. It needs a database like Elasticsearch to work.
@@ -16,38 +18,15 @@ you with everything you need to run this backend.
 
 # Local usage for development purposes
 ## Requirements
-* Elasticsearch (5.5)
-* Node.js (~10) / NPM
+* Elasticsearch (5.6)
+* Node.js (~14) / NPM
 * Docker
 
-## Generating Elasticsearch Mapping
-The mappings will be generated automatically on the first start. If there are any errors, the backend will inform you and stop
-the execution; however, it will do its best to complete the mappings. You can then resolve these errors in either the `core-tools` or the `core`, depending on where they originated.
-If you need a quick solution, you can also take the generated output file, manually correct the errors, rename it to `[coreVersion]_template_[type].json` (replace any spaces with a `_`),
-and restart the backend (make sure that you don't have `ES_FORCE_MAPPING_UPDATE` set to `true`). This time it will take your file. *The filenames and the path will also be displayed in the log of the backend.*
-
-### Manually Resolving Errors
-There are multiple types of errors the backend can run into. Manual error resolving requires you to be familiar with Elasticsearch
-mappings.
-An error will be represented in the output through an Elasticsearch type written in CAPS. Refer to either the console output
-or the `[coreVersion]_error_report.txt` for more info. If you feel lucky you can try to replace every error (`"type": "MISSING_PREMAP"`,
-`"type": "PARSE_ERROR"`, `"type": "TYPE_CONFLICT"`) with
-```json
-"dynamic": true,
-"properties": {}
-```
-This should ONLY be used as a temporary workaround and might compromise other features.
-
 ### Startup Behaviour
 *This might be important if you work on the Core*
-The backend is using the `core-tools` to automatically generate Elasticsearch Mappings and Aggregations from the current `core` version.
+The backend uses the Elasticsearch Mappings and Aggregations shipped with its currently used `core` dependency.
-By default, the backend creates a local copy of the generated mappings and aggregations in `src/storage/elasticsearch/templates/[coreVersion]_template_[type].json` and `src/storage/elasticsearch/templates/[coreVersion]_aggregations.json`.
-On each start, it first checks if the aggregation file exists, because it does not know which of the types actually exist for the current core version. If the file does exist, it will just use the existing files and *not* generate a new mapping, to cut down the time
-it takes to start the backend. When you are working on the Core you might not want this behaviour; you can then either delete
-the generated file at each start or run the backend with the environment variable `ES_FORCE_MAPPING_UPDATE=true`. This will cause it to generate the mapping
-each time it starts, regardless of whether there are already files there.
 
 ## Start Database (Elasticsearch)
 Elasticsearch needs some configuration and plugins to be able to work
@@ -95,7 +74,6 @@ The list of environment variables includes:
 * `NODE_ENV` when set to `production`, there will be a reduced amount of output from the logger
 * `PORT` when this is not set, the backend will default to port 3000
 * `ES_ADDR` the Elasticsearch address; if not set, it will default to `http://localhost:9200`
-* `ES_FORCE_MAPPING_UPDATE` when this variable is set to `true`, the backend will always generate a new Elasticsearch mapping from the core, regardless of whether there is already a version present. This should only really be used when you are working on the core.
 * `ALLOW_NO_TRANSPORT` if set to `true`, the backend will allow starting without an email address configured to receive critical errors.
 * `ES_DEBUG` setting this to `true` will make Elasticsearch logging **VERY** extensive; in almost all situations this should not be enabled.
 * `PROMETHEUS_MIDDLEWARE` if set to `true` will enable metrics collection with [Express Prometheus Middleware](https://www.npmjs.com/package/express-prometheus-middleware)
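The precedence these variables follow (explicit value wins, documented default otherwise) can be sketched as below; `envOr` is an invented helper for illustration, not part of the backend's actual code:

```typescript
// Hypothetical helper illustrating the documented defaults; the variable names
// (PORT, ES_ADDR, ES_DEBUG) are the ones listed above.
function envOr(name: string, fallback: string): string {
  const value = process.env[name];
  return value !== undefined && value !== '' ? value : fallback;
}

const port = Number(envOr('PORT', '3000'));               // README: defaults to 3000
const esAddr = envOr('ES_ADDR', 'http://localhost:9200'); // README: default Elasticsearch address
const esDebug = process.env.ES_DEBUG === 'true';          // strictly opt-in
console.log(`port=${port} es=${esAddr} debug=${esDebug}`);
```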


@@ -1,7 +1,7 @@
 // tslint:disable:no-default-export
 // tslint:disable:no-magic-numbers
 import {RecursivePartial} from '@openstapps/logger/lib/common';
-import {ElasticsearchConfigFile} from '../src/storage/elasticsearch/common';
+import {ElasticsearchConfigFile} from '../src/storage/elasticsearch/types/elasticsearch';
 
 /**
  * This is the database configuration for the Technical University of Berlin


@@ -1,6 +1,6 @@
 // tslint:disable:no-default-export
 // tslint:disable:no-magic-numbers
-import {ElasticsearchConfigFile} from '../src/storage/elasticsearch/common';
+import {ElasticsearchConfigFile} from '../src/storage/elasticsearch/types/elasticsearch';
 
 /**
  * This is the default configuration for elasticsearch (a database)


@@ -10,7 +10,6 @@ services:
       NODE_CONFIG_ENV: "elasticsearch"
       NODE_ENV: "integration-test"
       ALLOW_NO_TRANSPORT: "true"
-      ES_FORCE_MAPPING_UPDATE: "true"
       ES_ADDR: "http://elasticsearch:9200"
   elasticsearch:
@@ -25,4 +24,4 @@ services:
       STAPPS_EXIT_LEVEL: "8"
     volumes:
       - ./node_modules/@openstapps/core/test/resources:/@openstapps/core/test/resources:ro
-    command: e2e http://backend:3000 --waiton tcp:backend:3000 --samples /@openstapps/core/test/resources
+    command: e2e http://backend:3000 --waiton tcp:backend:3000 --samples /@openstapps/core/test/resources/indexable

package-lock.json (generated): 1760 changed lines; diff suppressed because it is too large.


@@ -25,41 +25,39 @@
     "preversion": "npm run prepublishOnly",
     "push": "git push && git push origin \"v$npm_package_version\"",
     "start": "NODE_CONFIG_ENV=elasticsearch ALLOW_NO_TRANSPORT=true node ./lib/cli.js",
-    "start-debug": "STAPPS_LOG_LEVEL=31 NODE_CONFIG_ENV=elasticsearch ALLOW_NO_TRANSPORT=true ES_FORCE_MAPPING_UPDATE=true node ./lib/cli.js --require ts-node/register",
+    "start-debug": "STAPPS_LOG_LEVEL=31 NODE_CONFIG_ENV=elasticsearch ALLOW_NO_TRANSPORT=true node ./lib/cli.js --require ts-node/register",
     "test": "npm run test-unit && npm run test-integration",
-    "test-unit": "env NODE_CONFIG_ENV=elasticsearch ALLOW_NO_TRANSPORT=true ES_FORCE_MAPPING_UPDATE=true STAPPS_LOG_LEVEL=0 nyc mocha --require ts-node/register --exit 'test/**/*.spec.ts'",
+    "test-unit": "env NODE_CONFIG_ENV=elasticsearch ALLOW_NO_TRANSPORT=true STAPPS_LOG_LEVEL=0 nyc mocha --require ts-node/register --exit 'test/**/*.spec.ts'",
     "test-integration": "sudo docker-compose -f integration-test.yml pull && sudo docker-compose -f integration-test.yml up --build --abort-on-container-exit --exit-code-from apicli",
     "tslint": "tslint -p tsconfig.json -c tslint.json 'src/**/*.ts'"
   },
   "dependencies": {
     "@elastic/elasticsearch": "5.6.22",
-    "@openstapps/core": "0.48.0",
-    "@openstapps/core-tools": "0.23.2",
+    "@openstapps/core": "0.50.0",
+    "@openstapps/core-tools": "0.25.0",
     "@openstapps/logger": "0.7.0",
     "@types/express-prometheus-middleware": "1.2.1",
-    "@types/node": "14.17.7",
+    "@types/node": "14.17.12",
+    "commander": "7.2.0",
     "config": "3.3.6",
     "cors": "2.8.5",
     "express": "4.17.1",
     "express-prometheus-middleware": "1.2.0",
     "express-promise-router": "4.1.0",
-    "fs-extra": "9.1.0",
     "got": "11.8.2",
     "moment": "2.29.1",
     "morgan": "1.10.0",
-    "nock": "13.1.1",
+    "nock": "13.1.3",
     "node-cache": "5.1.2",
     "node-cron": "3.0.0",
     "nodemailer": "6.6.3",
-    "prom-client": "12.0.0",
+    "prom-client": "13.2.0",
     "promise-queue": "2.2.5",
-    "sanitize-filename": "1.6.3",
-    "ts-node": "9.1.1",
+    "ts-node": "10.2.1",
     "uuid": "8.3.2"
   },
   "devDependencies": {
     "@openstapps/configuration": "0.27.0",
-    "@openstapps/es-mapping-generator": "0.0.3",
     "@testdeck/mocha": "0.1.2",
     "@types/chai": "4.2.21",
     "@types/chai-as-promised": "7.1.4",
@@ -67,9 +65,8 @@
     "@types/cors": "2.8.12",
     "@types/elasticsearch": "5.0.38",
     "@types/express": "4.17.13",
-    "@types/fs-extra": "9.0.12",
     "@types/geojson": "1.0.6",
-    "@types/mocha": "8.2.3",
+    "@types/mocha": "9.0.0",
     "@types/morgan": "1.9.3",
     "@types/node-cron": "2.0.4",
     "@types/nodemailer": "6.4.4",
@@ -81,17 +78,17 @@
     "chai-as-promised": "7.1.1",
     "conventional-changelog-cli": "2.1.1",
     "get-port": "5.1.1",
-    "mocha": "8.4.0",
+    "mocha": "9.1.1",
     "mocked-env": "1.3.5",
     "nyc": "15.1.0",
     "prepend-file-cli": "1.0.6",
-    "redoc-cli": "0.12.2",
+    "redoc-cli": "0.12.3",
     "rimraf": "3.0.2",
-    "sinon": "10.0.0",
+    "sinon": "11.1.2",
     "sinon-express-mock": "2.2.1",
-    "supertest": "6.1.4",
+    "supertest": "6.1.6",
     "tslint": "6.1.3",
-    "typedoc": "0.18.0",
+    "typedoc": "0.21.9",
     "typescript": "3.8.3"
   },
   "nyc": {


@@ -19,7 +19,7 @@ import {
   SCRoute,
   SCValidationErrorResponse,
 } from '@openstapps/core';
-import {ValidationError} from '@openstapps/core-tools/lib/common';
+import {ValidationError} from '@openstapps/core-tools/src/types/validator';
 import {Logger} from '@openstapps/logger';
 import {Application, Router} from 'express';
 import PromiseRouter from 'express-promise-router';


@@ -14,44 +14,31 @@
  * along with this program. If not, see <https://www.gnu.org/licenses/>.
  */
 import {SCFacet, SCThingType} from '@openstapps/core';
-import {readFileSync} from 'fs';
+import {aggregations} from './templating';
+import {AggregationResponse} from './types/elasticsearch';
 import {
-  AggregationResponse,
-  AggregationSchema,
   isBucketAggregation,
   isESAggMatchAllFilter,
   isESNestedAggregation,
   isESTermsFilter,
   isNestedAggregation,
-} from './common';
+} from './types/guards';
-import {aggregationsPath} from './templating';
-
-/**
- * Builds the aggregation
- * @returns a schema to tell elasticsearch which aggregations to collect
- */
-export function buildAggregations(): AggregationSchema {
-  return JSON.parse((readFileSync(aggregationsPath, 'utf8')).toString());
-}
 
 /**
  * Parses elasticsearch aggregations (response from es) to facets for the app
- * @param aggregationSchema - aggregation-schema for elasticsearch
- * @param aggregations - aggregations response from elasticsearch
+ * @param aggregationResponse - aggregations response from elasticsearch
  */
-export function parseAggregations(
-  aggregationSchema: AggregationSchema,
-  aggregations: AggregationResponse): SCFacet[] {
+export function parseAggregations(aggregationResponse: AggregationResponse): SCFacet[] {
   const facets: SCFacet[] = [];
 
   // get all names of the types an aggregation is on
-  for (const typeName in aggregationSchema) {
-    if (aggregationSchema.hasOwnProperty(typeName) && aggregations.hasOwnProperty(typeName)) {
+  for (const typeName in aggregations) {
+    if (aggregations.hasOwnProperty(typeName) && aggregationResponse.hasOwnProperty(typeName)) {
       // the type object from the schema
-      const type = aggregationSchema[typeName];
+      const type = aggregations[typeName];
       // the "real" type object from the response
-      const realType = aggregations[typeName];
+      const realType = aggregationResponse[typeName];
       // both conditions must apply, else we have an error somewhere
       if (isESNestedAggregation(type) && isNestedAggregation(realType)) {
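The shape of the walk `parseAggregations` performs can be reduced to a toy version (the interfaces and field names below are invented for illustration; the real code narrows with the type guards and produces `SCFacet` objects from `@openstapps/core`):

```typescript
// Toy bucket shapes standing in for the Elasticsearch aggregation response.
interface Bucket { key: string; doc_count: number; }
interface ToyFacet { field: string; values: Array<{value: string; count: number}>; }

function toFacets(response: Record<string, {buckets?: Bucket[]}>): ToyFacet[] {
  const facets: ToyFacet[] = [];
  for (const name in response) {
    const buckets = response[name].buckets;
    if (Array.isArray(buckets)) {
      // one facet per aggregation name, one value per bucket
      facets.push({field: name, values: buckets.map(b => ({value: b.key, count: b.doc_count}))});
    }
  }
  return facets;
}
```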


@@ -33,17 +33,16 @@ import moment from 'moment';
 import {MailQueue} from '../../notification/mail-queue';
 import {Bulk} from '../bulk-storage';
 import {Database} from '../database';
-import {buildAggregations, parseAggregations} from './aggregations';
+import {parseAggregations} from './aggregations';
-import * as Monitoring from './monitoring';
-import {buildQuery, buildSort} from './query';
+import {aggregations, putTemplate} from './templating';
 import {
   AggregationResponse,
-  AggregationSchema,
   ElasticsearchConfig, ElasticsearchObject,
   ElasticsearchQueryDisMaxConfig,
   ElasticsearchQueryQueryStringConfig,
-} from './common';
+} from './types/elasticsearch';
+import * as Monitoring from './monitoring';
+import {buildQuery, buildSort} from './query';
-import {checkESTemplate, putTemplate} from './templating';
 
 /**
  * Matches index names such as stapps_<type>_<source>_<random suffix>
@@ -60,11 +59,6 @@ export class Elasticsearch implements Database {
   */
  static readonly INDEX_UID_LENGTH = 8;
 
-  /**
-   * Holds aggregations
-   */
-  aggregationsSchema: AggregationSchema;
-
  /**
   * Holds a map of all elasticsearch indices that are available to search
   */
@@ -207,11 +201,6 @@ export class Elasticsearch implements Database {
     this.aliasMap = {};
     this.ready = false;
 
-    checkESTemplate(typeof process.env.ES_FORCE_MAPPING_UPDATE !== 'undefined' ?
-      process.env.ES_FORCE_MAPPING_UPDATE === 'true' : false);
-    this.aggregationsSchema = buildAggregations();
-
     this.mailQueue = mailQueue;
   }
@@ -221,6 +210,8 @@ export class Elasticsearch implements Database {
   private async getAliasMap() {
     // delay after which alias map will be fetched again
     const RETRY_INTERVAL = 5000;
+    // maximum number of retries
+    const RETRY_COUNT = 3;
 
     // create a list of old indices that are not in use
     const oldIndicesToDelete: string[] = [];
@@ -233,17 +224,24 @@
           [K in SCThingType]: unknown
         };
       };
-    };
+    } | undefined;
+
+    for (const retry of [...Array(RETRY_COUNT)].map((_, i) => i + 1)) {
+      if (typeof aliases !== 'undefined') {
+        break;
+      }
       try {
-        aliases = (await this.client.indices.getAlias({})).body;
+        const aliasResponse = await this.client.indices.getAlias({});
+        aliases = aliasResponse.body;
       } catch (error) {
-        await Logger.error('Failed getting alias map:', error);
-        setTimeout(async () => {
-          return this.getAliasMap();
-        }, RETRY_INTERVAL); // retry after a delay
-
-        return;
+        Logger.warn('Failed getting alias map:', error);
+        Logger.warn(`Retrying in ${RETRY_INTERVAL} milliseconds. (${retry} of ${RETRY_COUNT})`);
+        await new Promise(resolve => setTimeout(resolve, RETRY_INTERVAL));
+      }
+    }
+
+    if (typeof aliases === 'undefined') {
+      throw Error(`Failed to retrieve alias map after ${RETRY_COUNT} attempts!`);
     }
 
     for (const index in aliases) {
@@ -566,7 +564,7 @@
     const searchRequest: RequestParams.Search = {
       body: {
-        aggs: this.aggregationsSchema, // use cached version of aggregations (they only change if config changes)
+        aggs: aggregations,
         query: buildQuery(params, this.config, esConfig),
       },
       from: params.from,
@@ -603,7 +601,7 @@
     // read the aggregations from elasticsearch and parse them to facets by our configuration
     if (typeof response.body.aggregations !== 'undefined') {
-      facets = parseAggregations(this.aggregationsSchema, response.body.aggregations as AggregationResponse);
+      facets = parseAggregations(response.body.aggregations as AggregationResponse);
     }
 
     return {
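The bounded-retry loop that replaces the old `setTimeout`-and-`return` scheduling can be written as a generic helper (a sketch only; the real method retries `indices.getAlias` inline with `RETRY_INTERVAL = 5000` and `RETRY_COUNT = 3`):

```typescript
// Generic form of the retry pattern above. Assumes the wrapped call never
// legitimately resolves to undefined, like the alias-map fetch.
async function withRetries<T>(
  attempt: () => Promise<T>,
  retryCount: number,
  retryIntervalMs: number,
): Promise<T> {
  let result: T | undefined;
  for (let retry = 1; retry <= retryCount; retry++) {
    try {
      result = await attempt();
      break; // success: stop retrying
    } catch (error) {
      if (retry < retryCount) {
        // wait in place before the next attempt instead of rescheduling
        await new Promise(resolve => setTimeout(resolve, retryIntervalMs));
      }
    }
  }
  if (typeof result === 'undefined') {
    throw Error(`Failed after ${retryCount} attempts!`);
  }
  return result;
}
```

Compared to the previous `setTimeout`-and-`return` approach, awaiting the delay keeps all attempts inside one call stack, so exhaustion surfaces as a single thrown error instead of a silently abandoned re-invocation.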


@@ -45,7 +45,7 @@ import {
   ESTermFilter,
   ESTypeFilter,
   ScriptSort,
-} from './common';
+} from './types/elasticsearch';
 
 /**
  * Escapes any reserved character that would otherwise not be accepted by Elasticsearch


@@ -1,2 +0,0 @@
-*.json
-*.txt


@@ -15,77 +15,18 @@
  */
 import {Client} from '@elastic/elasticsearch';
 import {SCThingType} from '@openstapps/core';
-import {getProjectReflection} from '@openstapps/core-tools/lib/common';
-import {generateTemplate} from '@openstapps/core-tools/lib/mapping';
-import {Logger} from '@openstapps/logger';
-import {existsSync, writeFileSync} from 'fs';
-import {readFile} from 'fs-extra';
+// tslint:disable-next-line:no-implicit-dependencies
+import {AggregationSchema} from '@openstapps/es-mapping-generator/src/types/aggregation';
+// tslint:disable-next-line:no-implicit-dependencies
+import {ElasticsearchTemplateCollection} from '@openstapps/es-mapping-generator/src/types/mapping';
+import {readFileSync} from 'fs';
 import {resolve} from 'path';
-import sanitize = require('sanitize-filename');
-import {configFile, coreVersion} from '../../common';
 
-const dirPath = resolve('src', 'storage', 'elasticsearch', 'templates');
+const mappingsPath = resolve('node_modules', '@openstapps', 'core', 'lib', 'mappings');
-export const aggregationsPath = resolve(dirPath, sanitize(`${coreVersion}-aggregations.json`, {replacement: '-'}));
-const templateErrorPath = resolve(dirPath, sanitize(`${coreVersion}-template-[type].error.json`, {replacement: '-'}));
-const aggregationsErrorPath = resolve(dirPath, sanitize(`${coreVersion}-aggregations.error.json`, {replacement: '-'}));
-const errorReportPath = resolve(dirPath, sanitize(`${coreVersion}-error-report.txt`, {replacement: '-'}));
 
+export const mappings = JSON.parse(readFileSync(resolve(mappingsPath, 'mappings.json'), 'utf-8')) as ElasticsearchTemplateCollection;
+export const aggregations = JSON.parse(readFileSync(resolve(mappingsPath, 'aggregations.json'), 'utf-8')) as AggregationSchema;
-/**
- * Check if the correct template exists
- */
-export function checkESTemplate(forceUpdate: boolean) {
-  // as the forced mapping update is only meant for development, print a warning if it is enabled
-  if (forceUpdate) {
-    Logger.warn('CAUTION: Force update of the mapping files is enabled. This causes the backend to ignore' +
-      ' existing mapping files on start.');
-  }
-  // we don't exactly know which files are there, so we just check if the aggregations exist
-  // for the current core version
-  if (forceUpdate || !existsSync(aggregationsPath)) {
-    Logger.info(`No mapping for Core version ${coreVersion} found, starting automatic mapping generation. ` +
-      `This may take a while.`);
-    const map = generateTemplate(getProjectReflection(resolve('node_modules', '@openstapps', 'core', 'src')),
-      configFile.backend.mappingIgnoredTags, false);
-    if (map.errors.length > 0) {
-      for (const type of Object.keys(map.mappings)) {
-        writeFileSync(getTemplatePath(Object.keys(map.mappings[type].mappings)[0] as SCThingType, true),
-          // tslint:disable-next-line:no-magic-numbers
-          JSON.stringify(map.mappings[type], null, 2));
-      }
-      // tslint:disable-next-line:no-magic-numbers
-      writeFileSync(aggregationsErrorPath, JSON.stringify(map.aggregations, null, 2));
-      writeFileSync(errorReportPath, `ERROR REPORT FOR CORE VERSION ${coreVersion}\n${map.errors.join('\n')}`);
-      void Logger.error(`There were errors while generating the template, and the backend cannot continue. A list of ` +
-        `all errors can be found at ${errorReportPath}. To resolve this` +
-        ` issue by hand you can go to "${templateErrorPath}" and "${aggregationsErrorPath}", then correct the issues` +
-        ` manually and move the files to the template paths and "${aggregationsPath}" respectively.`);
-      process.exit(1);
-    } else {
-      Logger.ok('Mapping files were generated successfully.');
-      for (const type of Object.keys(map.mappings)) {
-        writeFileSync(getTemplatePath(Object.keys(map.mappings[type].mappings)[0] as SCThingType, false),
-          // tslint:disable-next-line:no-magic-numbers
-          JSON.stringify(map.mappings[type], null, 2));
-      }
-      writeFileSync(aggregationsPath, JSON.stringify(map.aggregations));
-    }
-  } else {
-    Logger.info(`Using existing mappings for core version ${coreVersion}`);
-  }
-}
-
-/**
- * Generates the path to the template of an SCThingType
- *
- * @param type the type for the path
- * @param error whether an error occurred in the file
- */
-function getTemplatePath(type: SCThingType, error = false): string {
-  return resolve(dirPath, sanitize(`${coreVersion}-template-${type}${error ? '.error' : ''}.json`, {replacement: '-'}));
-}
 
 /**
  * Re-applies all interfaces for every type
@@ -107,13 +48,10 @@ export async function refreshAllTemplates(client: Client) {
  * @param client An elasticsearch client to use
  */
 export async function putTemplate(client: Client, type: SCThingType) {
-  let out = type.toLowerCase();
-  while (out.includes(' ')) {
-    out = out.replace(' ', '_');
-  }
+  const sanitizedType = `template_${type.replace(/\s/g, '_')}`;
 
   return client.indices.putTemplate({
-    body: JSON.parse((await readFile(getTemplatePath(type), 'utf8')).toString()),
-    name: `template_${out}`,
+    body: mappings[sanitizedType],
+    name: sanitizedType,
   });
 }
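The new template naming can be seen in isolation below; note that, unlike the old `toLowerCase()` loop, the regex replacement keeps the type's original casing (the example type names are invented):

```typescript
// Collapses every whitespace character in a type name to an underscore and
// prefixes it, mirroring the sanitizedType expression in putTemplate.
function templateName(type: string): string {
  return `template_${type.replace(/\s/g, '_')}`;
}

console.log(templateName('dish'));         // template_dish
console.log(templateName('sport course')); // template_sport_course
```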


@@ -1,5 +1,5 @@
 /*
- * Copyright (C) 2019 StApps
+ * Copyright (C) 2019-2021 StApps
  * This program is free software: you can redistribute it and/or modify
  * it under the terms of the GNU Affero General Public License as
  * published by the Free Software Foundation, either version 3 of the
@@ -13,15 +13,8 @@
  * You should have received a copy of the GNU Affero General Public License
  * along with this program. If not, see <https://www.gnu.org/licenses/>.
  */
-import {SCThingType} from '@openstapps/core';
-import {SCThing} from '@openstapps/core';
+import {SCThing, SCThingType} from '@openstapps/core';
-import {
-  ESAggMatchAllFilter,
-  ESAggTypeFilter, ESNestedAggregation,
-  ESTermsFilter,
-} from '@openstapps/core-tools/lib/mappings/aggregation-definitions';
 // we only have the @types package because some things type definitions are still missing from the official
-// @elastic/elasticsearch package
 // tslint:disable-next-line:no-implicit-dependencies
 import {NameList} from 'elasticsearch';
 // tslint:disable-next-line:no-implicit-dependencies
@@ -67,14 +60,6 @@ export interface BucketAggregation {
   doc_count?: number;
 }
 
-/**
- * Checks if the type is a BucketAggregation
- * @param agg the type to check
- */
-export function isBucketAggregation(agg: BucketAggregation | number): agg is BucketAggregation {
-  return typeof agg !== 'number';
-}
-
 /**
  * An aggregation that contains more aggregations nested inside
  */
@@ -90,21 +75,6 @@ export interface NestedAggregation {
   [name: string]: BucketAggregation | number;
 }
 
-/**
- * Checks if the type is a NestedAggregation
- * @param agg the type to check
- */
-export function isNestedAggregation(agg: BucketAggregation | NestedAggregation): agg is NestedAggregation {
-  return typeof (agg as BucketAggregation).buckets === 'undefined';
-}
-
-/**
- * An elasticsearch bucket aggregation
- * @see https://www.elastic.co/guide/en/elasticsearch/reference/5.6/search-aggregations-bucket.html
- */
-export interface AggregationSchema {
-  [aggregationName: string]: ESTermsFilter | ESNestedAggregation;
-}
-
 /**
  * A configuration for using the Dis Max Query
@@ -348,30 +318,6 @@
 export type ESDateRangeFilter = ESGenericRangeFilter<string, ESDateRange>;
 export type ESRangeFilter = ESNumericRangeFilter | ESDateRangeFilter;
 
-/**
- * Checks if the parameter is of type ESTermsFilter
- * @param agg the value to check
- */
-export function isESTermsFilter(agg: ESTermsFilter | ESNestedAggregation): agg is ESTermsFilter {
-  return typeof (agg as ESTermsFilter).terms !== 'undefined';
-}
-
-/**
- * Checks if the parameter is of type ESNestedAggregation
- * @param agg the value to check
- */
-export function isESNestedAggregation(agg: ESTermsFilter | ESNestedAggregation): agg is ESNestedAggregation {
-  return typeof (agg as ESNestedAggregation).aggs !== 'undefined';
-}
-
-/**
- * Checks if the parameter is of type ESAggMatchAllFilter
- *
- * @param filter the filter to narrow the type of
- */
-export function isESAggMatchAllFilter(filter: ESAggTypeFilter | ESAggMatchAllFilter): filter is ESAggMatchAllFilter {
-  return filter.hasOwnProperty('match_all');
-}
-
 /**
  * An elasticsearch type filter


@@ -0,0 +1,64 @@
/*
* Copyright (C) 2019-2021 StApps
* This program is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as
* published by the Free Software Foundation, either version 3 of the
* License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with this program. If not, see <https://www.gnu.org/licenses/>.
*/
import {
ESAggMatchAllFilter,
ESAggTypeFilter,
ESNestedAggregation,
ESTermsFilter,
// tslint:disable-next-line:no-implicit-dependencies we're just using the types here
} from '@openstapps/es-mapping-generator/src/types/aggregation';
import {BucketAggregation, NestedAggregation} from './elasticsearch';
/**
* Checks if the type is a BucketAggregation
* @param agg the type to check
*/
export function isBucketAggregation(agg: BucketAggregation | number): agg is BucketAggregation {
return typeof agg !== 'number';
}
/**
* Checks if the type is a NestedAggregation
* @param agg the type to check
*/
export function isNestedAggregation(agg: BucketAggregation | NestedAggregation): agg is NestedAggregation {
return typeof (agg as BucketAggregation).buckets === 'undefined';
}
/**
* Checks if the parameter is of type ESTermsFilter
* @param agg the value to check
*/
export function isESTermsFilter(agg: ESTermsFilter | ESNestedAggregation): agg is ESTermsFilter {
return typeof (agg as ESTermsFilter).terms !== 'undefined';
}
/**
 * Checks if the parameter is of type ESNestedAggregation
* @param agg the value to check
*/
export function isESNestedAggregation(agg: ESTermsFilter | ESNestedAggregation): agg is ESNestedAggregation {
return typeof (agg as ESNestedAggregation).aggs !== 'undefined';
}
/**
 * Checks if the parameter is of type ESAggMatchAllFilter
*
* @param filter the filter to narrow the type of
*/
export function isESAggMatchAllFilter(filter: ESAggTypeFilter | ESAggMatchAllFilter): filter is ESAggMatchAllFilter {
return filter.hasOwnProperty('match_all');
}
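As a usage sketch, the guards above let callers narrow a union without casts. The interfaces below are trimmed structural stand-ins for the imported types (the real definitions live in `@openstapps/es-mapping-generator`), so treat this as an illustration, not the actual module:

```typescript
// Trimmed stand-ins for the imported aggregation types (illustrative only).
interface ESTermsFilter { terms: { field: string; size: number } }
interface ESNestedAggregation { aggs: Record<string, unknown> }

// Same shape-probing technique as the guards above.
function isESTermsFilter(agg: ESTermsFilter | ESNestedAggregation): agg is ESTermsFilter {
  return typeof (agg as ESTermsFilter).terms !== 'undefined';
}

const candidate: ESTermsFilter | ESNestedAggregation = {
  terms: { field: 'type.raw', size: 1000 },
};

if (isESTermsFilter(candidate)) {
  // Narrowed: candidate.terms is accessible without a cast.
  console.log(candidate.terms.field); // type.raw
}
```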

View File

@@ -21,7 +21,7 @@ import {
SCNotFoundErrorResponse, SCNotFoundErrorResponse,
} from '@openstapps/core'; } from '@openstapps/core';
import {expect} from 'chai'; import {expect} from 'chai';
import {instance as book} from '@openstapps/core/test/resources/Book.1.json'; import {instance as book} from '@openstapps/core/test/resources/indexable/Book.1.json';
import {bulk, DEFAULT_TEST_TIMEOUT} from '../common'; import {bulk, DEFAULT_TEST_TIMEOUT} from '../common';
import {testApp} from '../tests-setup'; import {testApp} from '../tests-setup';

View File

@@ -17,7 +17,7 @@ import {SCThingUpdateRoute} from '@openstapps/core';
import chaiAsPromised from 'chai-as-promised'; import chaiAsPromised from 'chai-as-promised';
import {bulkStorageMock, DEFAULT_TEST_TIMEOUT} from '../common'; import {bulkStorageMock, DEFAULT_TEST_TIMEOUT} from '../common';
import {expect, use} from 'chai'; import {expect, use} from 'chai';
import {instance as book} from '@openstapps/core/test/resources/Book.1.json'; import {instance as book} from '@openstapps/core/test/resources/indexable/Book.1.json';
import {testApp} from '../tests-setup'; import {testApp} from '../tests-setup';
use(chaiAsPromised); use(chaiAsPromised);

View File

@@ -16,118 +16,9 @@
import {SCFacet, SCThingType} from '@openstapps/core'; import {SCFacet, SCThingType} from '@openstapps/core';
import {expect} from 'chai'; import {expect} from 'chai';
import {parseAggregations} from '../../../src/storage/elasticsearch/aggregations'; import {parseAggregations} from '../../../src/storage/elasticsearch/aggregations';
import {AggregationResponse, AggregationSchema} from '../../../src/storage/elasticsearch/common'; import {AggregationResponse} from '../../../src/storage/elasticsearch/types/elasticsearch';
describe('Aggregations', function () { describe('Aggregations', function () {
const schema: AggregationSchema = {
'@all': {
aggs: {
type: {
terms: {
field: 'type.raw',
size: 1000
}
}
},
filter: {
match_all: {}
}
},
'academic event': {
aggs: {
'academicTerms.acronym': {
terms: {
field: 'academicTerms.acronym.raw',
size: 1000
}
},
'catalogs.categories': {
terms: {
field: 'catalogs.categories.raw',
size: 1000
}
},
categories: {
terms: {
field: 'categories.raw',
size: 1000
}
},
'creativeWorks.keywords': {
terms: {
field: 'creativeWorks.keywords.raw',
size: 1000
}
},
majors: {
terms: {
field: 'majors.raw',
size: 1000
}
}
},
filter: {
type: {
value: 'academic event'
}
}
},
catalog: {
aggs: {
'academicTerm.acronym': {
terms: {
field: 'academicTerm.acronym.raw',
size: 1000
}
},
categories: {
terms: {
field: 'categories.raw',
size: 1000
}
},
'superCatalog.categories': {
terms: {
field: 'superCatalog.categories.raw',
size: 1000
}
},
'superCatalogs.categories': {
terms: {
field: 'superCatalogs.categories.raw',
size: 1000
}
}
},
filter: {
type: {
value: 'catalog'
}
}
},
person: {
aggs: {
'homeLocations.categories': {
terms: {
field: 'homeLocations.categories.raw',
size: 1000
}
}
},
filter: {
type: {
value: 'person'
}
}
},
fooType: {
terms: {
field: 'foo',
size: 123,
}
}
};
const aggregations: AggregationResponse = { const aggregations: AggregationResponse = {
catalog: { catalog: {
doc_count: 4, doc_count: 4,
@@ -262,19 +153,11 @@ describe('Aggregations', function () {
field: 'categories', field: 'categories',
onlyOnType: SCThingType.Catalog, onlyOnType: SCThingType.Catalog,
}, },
{ // no fooType as it doesn't appear in the aggregation schema
buckets: [
{
count: 321,
key: 'foo'
}
],
field: 'fooType'
}
]; ];
it('should parse the aggregations providing the appropriate facets', function () { it('should parse the aggregations providing the appropriate facets', function () {
const facets = parseAggregations(schema, aggregations); const facets = parseAggregations(aggregations);
expect(facets).to.be.eql(expectedFacets); expect(facets).to.be.eql(expectedFacets);
}); });
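The expected facets in this spec follow the usual Elasticsearch convention: raw terms-aggregation buckets (`{key, doc_count}`) are renamed into the `{key, count}` shape the facets carry. A minimal sketch of that per-bucket mapping (`RawBucket` and `toFacetBuckets` are illustrative names, not the backend's actual helpers):

```typescript
// Raw bucket shape returned by an Elasticsearch terms aggregation.
interface RawBucket { key: string; doc_count: number }

// Facet bucket shape the expectations above use.
interface FacetBucket { key: string; count: number }

// Illustrative per-bucket rename; the real parseAggregations additionally
// attaches field names and per-type scoping (onlyOnType).
function toFacetBuckets(buckets: RawBucket[]): FacetBucket[] {
  return buckets.map(({ key, doc_count }) => ({ key, count: doc_count }));
}

console.log(toFacetBuckets([
  { key: 'person', doc_count: 13 },
  { key: 'catalog', doc_count: 4 },
]));
```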

View File

@@ -18,14 +18,16 @@ import {
ESAggTypeFilter, ESAggTypeFilter,
ESNestedAggregation, ESNestedAggregation,
ESTermsFilter ESTermsFilter
} from '@openstapps/core-tools/lib/mappings/aggregation-definitions'; } from '@openstapps/es-mapping-generator/src/types/aggregation';
import { expect } from "chai"; import {expect} from "chai";
import { import {
BucketAggregation,
isBucketAggregation, isESAggMatchAllFilter, isESNestedAggregation, isESTermsFilter,
isNestedAggregation, isNestedAggregation,
NestedAggregation isBucketAggregation,
} from '../../../src/storage/elasticsearch/common'; isESTermsFilter,
isESAggMatchAllFilter,
isESNestedAggregation
} from '../../../lib/storage/elasticsearch/types/guards';
import {BucketAggregation, NestedAggregation} from '../../../src/storage/elasticsearch/types/elasticsearch';
describe('Common', function () { describe('Common', function () {
const bucketAggregation: BucketAggregation = {buckets: []}; const bucketAggregation: BucketAggregation = {buckets: []};

View File

@@ -15,8 +15,8 @@
*/ */
import {ApiResponse, Client} from '@elastic/elasticsearch'; import {ApiResponse, Client} from '@elastic/elasticsearch';
import {SCBook, SCBulkResponse, SCConfigFile, SCMessage, SCSearchQuery, SCThings, SCThingType} from '@openstapps/core'; import {SCBook, SCBulkResponse, SCConfigFile, SCMessage, SCSearchQuery, SCThings, SCThingType} from '@openstapps/core';
import {instance as book} from '@openstapps/core/test/resources/Book.1.json'; import {instance as book} from '@openstapps/core/test/resources/indexable/Book.1.json';
import {instance as message} from '@openstapps/core/test/resources/Message.1.json'; import {instance as message} from '@openstapps/core/test/resources/indexable/Message.1.json';
import {Logger} from '@openstapps/logger'; import {Logger} from '@openstapps/logger';
import {SMTP} from '@openstapps/logger/lib/smtp'; import {SMTP} from '@openstapps/logger/lib/smtp';
import {expect, use} from 'chai'; import {expect, use} from 'chai';
@@ -26,8 +26,8 @@ import mockedEnv from 'mocked-env';
import sinon from 'sinon'; import sinon from 'sinon';
import {configFile} from '../../../src/common'; import {configFile} from '../../../src/common';
import {MailQueue} from '../../../src/notification/mail-queue'; import {MailQueue} from '../../../src/notification/mail-queue';
import * as aggregations from '../../../src/storage/elasticsearch/aggregations'; import {aggregations} from '../../../src/storage/elasticsearch/templating';
import {ElasticsearchObject} from '../../../src/storage/elasticsearch/common'; import {ElasticsearchObject} from '../../../src/storage/elasticsearch/types/elasticsearch';
import {Elasticsearch} from '../../../src/storage/elasticsearch/elasticsearch'; import {Elasticsearch} from '../../../src/storage/elasticsearch/elasticsearch';
import * as Monitoring from '../../../src/storage/elasticsearch/monitoring'; import * as Monitoring from '../../../src/storage/elasticsearch/monitoring';
import * as query from '../../../src/storage/elasticsearch/query'; import * as query from '../../../src/storage/elasticsearch/query';
@@ -41,7 +41,6 @@ describe('Elasticsearch', function () {
// increase timeout for the suite // increase timeout for the suite
this.timeout(DEFAULT_TEST_TIMEOUT); this.timeout(DEFAULT_TEST_TIMEOUT);
const sandbox = sinon.createSandbox(); const sandbox = sinon.createSandbox();
let checkESTemplateStub: sinon.SinonStub = sandbox.stub(templating, 'checkESTemplate');
before(function () { before(function () {
console.log('before'); console.log('before');
@@ -195,31 +194,6 @@ describe('Elasticsearch', function () {
restore(); restore();
}); });
it('should force mapping update if related process env variable is not set', async function () {
const restore = mockedEnv({
'ES_FORCE_MAPPING_UPDATE': undefined,
});
new Elasticsearch(configFile);
expect(checkESTemplateStub.calledWith(false)).to.be.true;
// restore env variables
restore();
});
it('should force mapping update if related process env variable is set', async function () {
const restore = mockedEnv({
'ES_FORCE_MAPPING_UPDATE': 'true',
});
new Elasticsearch(configFile);
expect(checkESTemplateStub.calledWith(true)).to.be.true;
// restore env variables
restore();
});
});
describe('init', async function () { describe('init', async function () {
const sandbox = sinon.createSandbox(); const sandbox = sinon.createSandbox();
after(function () { after(function () {
@@ -407,14 +381,20 @@ describe('Elasticsearch', function () {
}); });
it('should reject if object is not found', async function () { it('should reject if object is not found', async function () {
sandbox.stub(es.client, 'search').resolves({body:{hits: { hits: []}}}); sandbox.stub(es.client, 'search').resolves({body: {hits: {hits: []}}});
return expect(es.get('123')).to.rejectedWith('found'); return expect(es.get('123')).to.rejectedWith('found');
}); });
it('should provide the thing if object is found', async function () { it('should provide the thing if object is found', async function () {
const foundObject: ElasticsearchObject<SCMessage> = {_id: '', _index: '', _score: 0, _type: '', _source: message as SCMessage}; const foundObject: ElasticsearchObject<SCMessage> = {
sandbox.stub(es.client, 'search').resolves({body:{hits: { hits: [foundObject]}}}); _id: '',
_index: '',
_score: 0,
_type: '',
_source: message as SCMessage
};
sandbox.stub(es.client, 'search').resolves({body: {hits: {hits: [foundObject]}}});
return expect(await es.get('123')).to.be.eql(message); return expect(await es.get('123')).to.be.eql(message);
}); });
@@ -435,16 +415,28 @@ describe('Elasticsearch', function () {
it('should not post if the object already exists in an index which will not be rolled over', async function () { it('should not post if the object already exists in an index which will not be rolled over', async function () {
const index = getIndex(); const index = getIndex();
const oldIndex = index.replace('foosource', 'barsource'); const oldIndex = index.replace('foosource', 'barsource');
const object: ElasticsearchObject<SCMessage> = {_id: '', _index: oldIndex, _score: 0, _type: '', _source: message as SCMessage}; const object: ElasticsearchObject<SCMessage> = {
sandbox.stub(es.client, 'search').resolves({body:{hits: { hits: [object]}}}); _id: '',
_index: oldIndex,
_score: 0,
_type: '',
_source: message as SCMessage
};
sandbox.stub(es.client, 'search').resolves({body: {hits: {hits: [object]}}});
sandbox.stub(Elasticsearch, 'getIndex').returns(index); sandbox.stub(Elasticsearch, 'getIndex').returns(index);
return expect(es.post(object._source, bulk)).to.rejectedWith('exist'); return expect(es.post(object._source, bulk)).to.rejectedWith('exist');
}); });
it('should not reject if the object already exists but in an index which will be rolled over', async function () { it('should not reject if the object already exists but in an index which will be rolled over', async function () {
const object: ElasticsearchObject<SCMessage> = {_id: '', _index: getIndex(), _score: 0, _type: '', _source: message as SCMessage}; const object: ElasticsearchObject<SCMessage> = {
sandbox.stub(es.client, 'search').resolves({body:{hits: { hits: [object]}}}); _id: '',
_index: getIndex(),
_score: 0,
_type: '',
_source: message as SCMessage
};
sandbox.stub(es.client, 'search').resolves({body: {hits: {hits: [object]}}});
// return index name with different generated UID (see getIndex method) // return index name with different generated UID (see getIndex method)
sandbox.stub(Elasticsearch, 'getIndex').returns(getIndex()); sandbox.stub(Elasticsearch, 'getIndex').returns(getIndex());
@@ -452,7 +444,7 @@ describe('Elasticsearch', function () {
}); });
it('should reject if there is an object creation error on the elasticsearch side', async function () { it('should reject if there is an object creation error on the elasticsearch side', async function () {
sandbox.stub(es.client, 'search').resolves({body:{hits: { hits: []}}}); sandbox.stub(es.client, 'search').resolves({body: {hits: {hits: []}}});
sandbox.stub(es.client, 'create').resolves({body: {created: false}}); sandbox.stub(es.client, 'create').resolves({body: {created: false}});
return expect(es.post(message as SCMessage, bulk)).to.rejectedWith('creation'); return expect(es.post(message as SCMessage, bulk)).to.rejectedWith('creation');
@@ -460,11 +452,11 @@ describe('Elasticsearch', function () {
it('should create a new object', async function () { it('should create a new object', async function () {
let caughtParam: any; let caughtParam: any;
sandbox.stub(es.client, 'search').resolves({body:{hits: { hits: []}}}); sandbox.stub(es.client, 'search').resolves({body: {hits: {hits: []}}});
// @ts-ignore // @ts-ignore
let createStub = sandbox.stub(es.client, 'create').callsFake((param) => { let createStub = sandbox.stub(es.client, 'create').callsFake((param) => {
caughtParam = param; caughtParam = param;
return Promise.resolve({body: { created: true }}); return Promise.resolve({body: {created: true}});
}); });
await es.post(message as SCMessage, bulk); await es.post(message as SCMessage, bulk);
@@ -485,20 +477,32 @@ describe('Elasticsearch', function () {
sandbox.restore(); sandbox.restore();
}); });
it('should reject to put if the object does not already exist', async function () { it('should reject to put if the object does not already exist', async function () {
const object: ElasticsearchObject<SCMessage> = {_id: '', _index: getIndex(), _score: 0, _type: '', _source: message as SCMessage}; const object: ElasticsearchObject<SCMessage> = {
sandbox.stub(es.client, 'search').resolves({body:{hits: { hits: []}}}); _id: '',
_index: getIndex(),
_score: 0,
_type: '',
_source: message as SCMessage
};
sandbox.stub(es.client, 'search').resolves({body: {hits: {hits: []}}});
return expect(es.put(object._source)).to.rejectedWith('exist'); return expect(es.put(object._source)).to.rejectedWith('exist');
}); });
it('should update the object if it already exists', async function () { it('should update the object if it already exists', async function () {
let caughtParam: any; let caughtParam: any;
const object: ElasticsearchObject<SCMessage> = {_id: '', _index: getIndex(), _score: 0, _type: '', _source: message as SCMessage}; const object: ElasticsearchObject<SCMessage> = {
sandbox.stub(es.client, 'search').resolves({body:{hits: { hits: [object]}}}); _id: '',
_index: getIndex(),
_score: 0,
_type: '',
_source: message as SCMessage
};
sandbox.stub(es.client, 'search').resolves({body: {hits: {hits: [object]}}});
// @ts-ignore // @ts-ignore
const stubUpdate = sandbox.stub(es.client, 'update').callsFake((params) => { const stubUpdate = sandbox.stub(es.client, 'update').callsFake((params) => {
caughtParam = params; caughtParam = params;
return Promise.resolve({body: { created: true }}); return Promise.resolve({body: {created: true}});
}); });
await es.put(object._source); await es.put(object._source);
@@ -510,8 +514,20 @@ describe('Elasticsearch', function () {
describe('search', async function () { describe('search', async function () {
let es: Elasticsearch; let es: Elasticsearch;
const sandbox = sinon.createSandbox(); const sandbox = sinon.createSandbox();
const objectMessage: ElasticsearchObject<SCMessage> = {_id: '123', _index: getIndex(), _score: 0, _type: '', _source: message as SCMessage}; const objectMessage: ElasticsearchObject<SCMessage> = {
const objectBook: ElasticsearchObject<SCBook> = {_id: '321', _index: getIndex(), _score: 0, _type: '', _source: book as SCBook}; _id: '123',
_index: getIndex(),
_score: 0,
_type: '',
_source: message as SCMessage
};
const objectBook: ElasticsearchObject<SCBook> = {
_id: '321',
_index: getIndex(),
_score: 0,
_type: '',
_source: book as SCBook
};
const fakeEsAggregations = { const fakeEsAggregations = {
'@all': { '@all': {
doc_count: 17, doc_count: 17,
@@ -572,24 +588,22 @@ describe('Elasticsearch', function () {
{ {
buckets: [ buckets: [
{ {
count: 1, count: 13,
'key': 'foo' key: 'person',
}, },
{ {
count: 1, count: 4,
key: 'bar' key: 'catalog'
} }
], ],
field: 'type', field: 'type',
} }
]; ];
const parseAggregationsStub = sandbox.stub(aggregations, 'parseAggregations').returns(fakeFacets);
const {data, facets} = await es.search({}); const {data, facets} = await es.search({});
expect(data).to.be.eql([objectMessage._source, objectBook._source]); expect(data).to.be.eql([objectMessage._source, objectBook._source]);
expect(facets).to.be.eql(fakeFacets); expect(facets).to.be.eql(fakeFacets);
expect(parseAggregationsStub.calledWith(sinon.match.any, fakeEsAggregations)).to.be.true;
}); });
it('should provide pagination from params', async function () { it('should provide pagination from params', async function () {
@@ -645,7 +659,7 @@ describe('Elasticsearch', function () {
.calledWithMatch(searchStub, .calledWithMatch(searchStub,
{ {
body: { body: {
aggs: es.aggregationsSchema, aggs: aggregations,
query: fakeResponse, query: fakeResponse,
sort: fakeBuildSortResponse sort: fakeBuildSortResponse
}, },
@@ -656,4 +670,5 @@ describe('Elasticsearch', function () {
); );
}); });
}); });
});
}); });

View File

@@ -29,7 +29,6 @@ import {getTransport} from '../../common';
import { expect } from 'chai'; import { expect } from 'chai';
import sinon from 'sinon'; import sinon from 'sinon';
import cron from 'node-cron'; import cron from 'node-cron';
import * as templating from '../../../src/storage/elasticsearch/templating';
describe('Monitoring', async function () { describe('Monitoring', async function () {
const sandbox = sinon.createSandbox(); const sandbox = sinon.createSandbox();
@@ -51,7 +50,6 @@ describe('Monitoring', async function () {
transport = getTransport(true); transport = getTransport(true);
mailQueue = new MailQueue(transport); mailQueue = new MailQueue(transport);
cronScheduleStub = sandbox.stub(cron, 'schedule'); cronScheduleStub = sandbox.stub(cron, 'schedule');
sandbox.stub(templating, 'checkESTemplate');
}); });
afterEach(async function () { afterEach(async function () {
sandbox.restore(); sandbox.restore();

View File

@@ -22,15 +22,19 @@ import {
SCThingType SCThingType
} from '@openstapps/core'; } from '@openstapps/core';
import {expect} from 'chai'; import {expect} from 'chai';
import {ESDateRangeFilter, ESRangeFilter} from '../../../src/storage/elasticsearch/common';
import {ESNumericRangeFilter} from '../../../src/storage/elasticsearch/common';
import {configFile} from '../../../src/common';
import { import {
ElasticsearchConfig, ESBooleanFilter, ESGenericSort, ESGeoDistanceFilter, ESDateRangeFilter,
ESRangeFilter,
ESNumericRangeFilter,
ElasticsearchConfig,
ESBooleanFilter,
ESGenericSort,
ESGeoDistanceFilter,
ESGeoDistanceSort, ESGeoDistanceSort,
ESTermFilter, ESTermFilter,
ScriptSort ScriptSort
} from '../../../src/storage/elasticsearch/common'; } from '../../../src/storage/elasticsearch/types/elasticsearch';
import {configFile} from '../../../src/common';
import {buildBooleanFilter, buildFilter, buildQuery, buildSort} from '../../../src/storage/elasticsearch/query'; import {buildBooleanFilter, buildFilter, buildQuery, buildSort} from '../../../src/storage/elasticsearch/query';
describe('Query', function () { describe('Query', function () {
@@ -366,7 +370,7 @@ describe('Query', function () {
} }
}); });
it('should default to second scope', function() { it('should default to second scope', function () {
const filter = buildFilter({ const filter = buildFilter({
type: 'availability', type: 'availability',
arguments: { arguments: {
@@ -384,7 +388,7 @@ describe('Query', function () {
}, },
}; };
expect(filter).to.be.eql(expectedFilter); expect(filter).to.be.eql(expectedFilter);
}) });
it('should add || to dates', function () { it('should add || to dates', function () {
const filter = buildFilter({ const filter = buildFilter({

View File

@@ -1,201 +0,0 @@
/*
* Copyright (C) 2020 StApps
* This program is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as
* published by the Free Software Foundation, either version 3 of the
* License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with this program. If not, see <https://www.gnu.org/licenses/>.
*/
import {SCThingType} from '@openstapps/core';
import * as mapping from '@openstapps/core-tools/lib/mapping';
import {ElasticsearchTemplateCollection} from '@openstapps/core-tools/lib/mappings/mapping-definitions';
import {Logger} from '@openstapps/logger';
import {AggregationSchema} from '../../../src/storage/elasticsearch/common';
import {checkESTemplate, refreshAllTemplates} from '../../../src/storage/elasticsearch/templating';
import sinon from "sinon";
import * as path from 'path';
import * as common from '@openstapps/core-tools/lib/common';
import {expect} from 'chai';
import fs from 'fs';
import fsExtra from 'fs-extra';
import {Client} from '@elastic/elasticsearch';
describe('templating', function () {
describe('checkESTemplate', function () {
const sandbox = sinon.createSandbox();
let fakeMap: { aggregations: AggregationSchema, errors: string[], mappings: ElasticsearchTemplateCollection };
beforeEach(function () {
fakeMap = {
aggregations: {
'@all': {
aggs: {
type: {
terms: {
field: 'type.raw',
size: 1000
}
}
},
filter: {
match_all: {}
}
},
},
errors: [],
mappings: {
'template_dish': {
mappings: {
dish: {
// @ts-ignore just mock the mapping
foo: 'mapping'
}
},
settings: {
analysis: {
ducet_sort: {
filter: [
'german_phonebook'
],
tokenizer: 'keyword',
type: 'custom'
},
search_german: {
filter: [
'lowercase',
'german_stop',
'german_stemmer'
],
tokenizer: 'stapps_ngram',
type: 'custom'
}
},
max_result_window: 30000,
},
template: 'stapps_dish*'
},
'template_book': {
mappings: {
book: {
// @ts-ignore just mock the mapping
foo: 'mapping'
}
},
settings: {
analysis: {
ducet_sort: {
filter: [
'german_phonebook'
],
tokenizer: 'keyword',
type: 'custom'
},
search_german: {
filter: [
'lowercase',
'german_stop',
'german_stemmer'
],
tokenizer: 'stapps_ngram',
type: 'custom'
}
},
max_result_window: 30000,
},
template: 'stapps_book*'
}
}
}
});
afterEach(function () {
sandbox.restore();
});
it('should write new templates when "force update" is true', async function () {
sandbox.stub(Logger, 'error').resolves();
sandbox.stub(fs, 'existsSync').returns(true);
sandbox.stub(common, 'getProjectReflection');
let caughtData: any = [];
const writeFileSyncStub = sandbox.stub(fs, 'writeFileSync');
sandbox.stub(path, 'resolve').returns('/foo/bar');
sandbox.stub(mapping, 'generateTemplate').returns(fakeMap);
checkESTemplate(true);
expect(writeFileSyncStub.callCount).to.be.gt(0);
for (let i = 0; i < writeFileSyncStub.callCount; i++) {
caughtData.push(writeFileSyncStub.getCall(i).args[1]);
}
expect(caughtData).to.be.eql([
JSON.stringify(fakeMap.mappings['template_dish'], null, 2),
JSON.stringify(fakeMap.mappings['template_book'], null, 2),
JSON.stringify(fakeMap.aggregations),
]);
});
it('should not write new templates when "force update" is false', async function () {
sandbox.stub(Logger, 'error').resolves();
sandbox.stub(fs, 'existsSync').returns(true);
sandbox.stub(common, 'getProjectReflection');
const writeFileSyncStub = sandbox.stub(fs, 'writeFileSync');
sandbox.stub(path, 'resolve').returns('/foo/bar');
sandbox.stub(mapping, 'generateTemplate').returns(fakeMap);
checkESTemplate(false);
expect(writeFileSyncStub.called).to.be.false;
});
it('should terminate if there are errors in the map', async function () {
const processExitStub = sandbox.stub(process, 'exit');
const fakeMapWithErrors = {
...fakeMap,
errors: ['Foo Error']
};
sandbox.stub(Logger, 'error').resolves();
sandbox.stub(fs, 'existsSync').returns(true);
sandbox.stub(common, 'getProjectReflection');
sandbox.stub(fs, 'writeFileSync');
sandbox.stub(path, 'resolve').returns('/foo/bar');
sandbox.stub(mapping, 'generateTemplate').returns(fakeMapWithErrors);
checkESTemplate(true);
expect(processExitStub.called).to.be.true;
});
});
describe('refreshAllTemplates', async function () {
const sandbox = sinon.createSandbox();
const client = {
indices: {
putTemplate: (_template: any) => {
}
}
}
after(function () {
sandbox.restore();
});
it('should put templates for all types', async function () {
const clientPutTemplateStub = sandbox.stub(client.indices, 'putTemplate');
sandbox.stub(fsExtra, 'readFile').resolves(Buffer.from('{"foo": "file content"}', 'utf8'));
await refreshAllTemplates(client as Client);
for (const type of Object.values(SCThingType)) {
sinon.assert.calledWith(clientPutTemplateStub, {
body: {foo: 'file content'},
name: `template_${type.split(' ').join('_')}`
})
}
});
});
});
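The final assertion encodes the template naming scheme: one index template per `SCThingType`, named `template_<type>` with spaces replaced by underscores. A sketch of that derivation (`templateNameFor` is an illustrative helper, not part of the backend's API):

```typescript
// Derive the index template name for a thing type, mirroring the
// `template_${type.split(' ').join('_')}` expression asserted above.
function templateNameFor(type: string): string {
  return `template_${type.split(' ').join('_')}`;
}

console.log(templateNameFor('academic event')); // template_academic_event
console.log(templateNameFor('dish'));           // template_dish
```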