[DEPRECATED] - HOW TO (internal note) - thomasheckmann/zxinfo-es GitHub Wiki

How to create the ZXINFO Elasticsearch instance (for ZXINFO API + WEB)

Basic requirements:

  • Docker v20
  • NodeJS v18

Creating the Elasticsearch instance for ZXInfo API requires the following steps:

  • populate mariaDB with latest ZXDB
  • create JSON for all entries
  • update JSON with pre-generated screenshots (convert from SCR to PNG/GIF)
  • calculate md5hash for all files in ZXDB (for the filecheck endpoint)

Setup NodeJS environment

The programs that create the data files need to know which version of ZXDB they are building for, as well as the last published version.

Edit the file zxinfo-es/.env to match the new version of ZXDB and make the environment ready to build:

vi ~/Public/ZXINFO/zxinfo-es/.env
cd ~/Public/ZXINFO && source ./setENV.sh && echo $ZXDB_NEW

Check that $ZXDB_NEW is set correctly.
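The exact contents of .env are not shown in this guide; a minimal sketch, assuming it just exports the two version variables used throughout these notes (ZXDB_NEW and ZXDB_OLD - the version numbers below are examples only):

```shell
# Hypothetical .env contents - variable names taken from this guide,
# version numbers are made up. Written to /tmp here so it is runnable as-is.
cat > /tmp/example.env <<'EOF'
export ZXDB_NEW=1.0.127
export ZXDB_OLD=1.0.126
EOF

# Source it and confirm the variables are visible in the current shell.
. /tmp/example.env
echo "building ZXDB ${ZXDB_NEW} (previous release: ${ZXDB_OLD})"
```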

Create a new directory for the files generated for this release:

mkdir ~/Public/ZXINFO/zxinfo-data/release-$ZXDB_NEW

Populate mariaDB with latest ZXDB

The latest ZXDB is available on its GitHub page together with detailed documentation. Note the version number of ZXDB_mysql.sql - for example 1.0.83 (referred to below as the ZXDB version).

Copy the files ZXDB_mysql.sql and ZXDB_help_search.sql from ZXDB. Rename ZXDB_mysql.sql to include the version, e.g. ZXDB_mysql_1.0.83.sql

Run createZXDBcontainer.sh <zxdb_version> to create containers with mariaDB populated with data for a particular version and a container for phpmyadmin to access the database.

cd ~/Public/ZXINFO/ZXDB && git pull
unzip ZXDB_mysql.sql.zip && cp ZXDB_mysql.sql ../zxinfo-db/ZXDB_mysql_$ZXDB_NEW.sql && cp scripts/ZXDB_help_search.sql ../zxinfo-db/

# STOP any running mariadb & phpMyAdmin instances
cd ~/Public/ZXINFO/zxinfo-db && ./createZXDBcontainer.sh $ZXDB_NEW && ./wait-for-mysql.sh $ZXDB_NEW

When the ZXDB mariaDB instance is ready, phpMyAdmin is available at localhost:8080

Update md5 checksum for all files

The 'assets' folder should contain all files from the following 'repositories':

  • spectrumcomputing.co.uk (run syncSC.sh to synchronize)
  • WOS June 2017 archive (World of Spectrum June 2017 Mirror)
  • TOSEC 2020 Archive (TOSEC_2020)

The md5hash state from the previous run is imported from the files 'tmp_downloads_$ZXDB_OLD.csv' and 'md5db_$ZXDB_OLD.csv', so only changed downloads are processed.

Sync assets from SC website

cd ~/Public/ZXINFO/assets && ./syncSC.sh

Create folder for generated files with md5hash details

mkdir ~/Public/ZXINFO/zxinfo-data/release-$ZXDB_NEW/md5hash

Generate new set of md5hash files

cd ~/Public/ZXINFO/zxinfo-hash-check && node index.js --output ~/Public/ZXINFO/zxinfo-data/release-$ZXDB_NEW/md5hash
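Conceptually, the hash step walks the assets tree and records an md5 per file. A rough, runnable sketch of that idea (the real logic lives in zxinfo-hash-check/index.js; the 'path;md5' CSV layout here is an assumption, and a throwaway demo tree is used instead of the real assets folder):

```shell
# Sketch only: hash every file under a directory into a 'path;md5' CSV.
mkdir -p /tmp/assets-demo && printf 'demo' > /tmp/assets-demo/game.tap

# One CSV line per file: full path, then the md5 digest.
find /tmp/assets-demo -type f | while IFS= read -r f; do
  printf '%s;%s\n' "$f" "$(md5sum "$f" | cut -d' ' -f1)"
done > /tmp/md5db_demo.csv

cat /tmp/md5db_demo.csv
```

Comparing such a CSV against the previous release's copy is what lets the tool skip unchanged files.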

Update md5hash values with additional details from other sites

  • zx81stuff.org.uk

node update_extra_repro.js --file extra_repos/zx81stuff.co.uk.csv --output ~/Public/ZXINFO/zxinfo-data/release-$ZXDB_NEW/

When finished, the following new files have been created:

  • notfound.txt - list of files not found or with errors
  • 'data/md5hash' - JSON files to be imported into Elasticsearch

Create entries

mkdir ~/Public/ZXINFO/zxinfo-data/release-$ZXDB_NEW/entries
cd ../zxinfo-es && node --max-old-space-size=8192 create-entries-documents.js --all --output ~/Public/ZXINFO/zxinfo-data/release-$ZXDB_NEW/

NOTE:

  • Running create-entries-documents.js is known to fail, but all documents are created
  • zxscreens.txt contains filenames for screenshots that need to be converted (in a later step)

Mapping and import into Elasticsearch

An Elasticsearch instance must be running - for example, use the supplied setup. If upgrading, a cleanup is recommended first:

If using docker

docker-compose up -d

or, as an alternative, start a standalone container and wait for Elasticsearch to be ready

docker run -d --name zxinfo-es_$ZXDB_NEW -p 9200:9200 blacktop/elasticsearch:8.1 && ./wait-for-elasticsearch.sh localhost:9200
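wait-for-elasticsearch.sh itself is not shown in this guide; a script like it presumably just polls the cluster health endpoint until it answers. A minimal sketch (function name, retry count, and sleep interval are made up):

```shell
# Poll an Elasticsearch host until _cluster/health responds, or give up
# after a number of tries. Returns non-zero on timeout.
wait_for_es() {
  host="$1"; tries="${2:-30}"; i=0
  until curl -fsS "http://${host}/_cluster/health" >/dev/null 2>&1; do
    i=$((i+1))
    [ "$i" -ge "$tries" ] && return 1   # give up: ES never answered
    sleep 2
  done
}

# usage: wait_for_es localhost:9200
```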

Import the generated JSON into Elasticsearch

(cd ./Scripts/ && ./createEntries.sh)

Convert entry screenshots and import md5hash

Screenshot references in the zxscreens.txt file (created when generating the JSON documents) need to be converted from .scr to .png/.gif (.gif if FLASH is used). Cleaning up before processing screens is recommended.

mkdir ~/Public/ZXINFO/zxinfo-data/release-$ZXDB_NEW/screens

(cd UpdateScreens && php convert.php) && node update-new-screens.js ~/Public/ZXINFO/zxinfo-data/release-$ZXDB_NEW/screens/ && node update-md5hash.js ~/Public/ZXINFO/zxinfo-data/release-$ZXDB_NEW/md5hash/

Text scanning (BETA)

Make sure fontFiles[] in screen_scanner.js contains all fonts.

cd ~/Public/ZXINFO/SCRTextScanner/
node index.js --all 2>info.txt
node update-textscan.js output/textscan/

PREPARE new release

Update swagger (api)

cd ../zxinfo-api-v3/public && open -a "Visual Studio Code" swagger_v3.yaml
- change update date

git add swagger_v3.yaml && git commit -m "ZXDB update" && git push

Update WebApp (Vue)

create what's new

Creating 'what's new' requires the previous ES instance, which is compared to the newly generated JSON documents. It uses the environment variables ZXDB_NEW and ZXDB_OLD.

cd ../../zxinfo-es/ && node change-log.js && cp news.json ../zxinfo-vue/src/
cd ../zxinfo-vue && open -a "Visual Studio Code" src/views/Home.vue
- (update date)
git add src/news.json src/views/Home.vue && git commit -m "ZXDB update" && git push

Test new version (local)

Switch back to newly created ES instance

  • stop previous one (1.0.124)
  • start Elasticsearch (current 1.0.126)
# start API
cd ~/Public/ZXINFO/zxinfo-api-v3/
NODE_ENV=development PORT=8300 DEBUG=zxinfo-api-v3:* nodemon --ignore public/javascripts/config.js --exec 'yarn start'

# TEST API
cd ~/Public/ZXINFO/zxinfo-api-v3 && node --test

# start webapp
cd ~/Public/ZXINFO/zxinfo-vue/ && NODE_OPTIONS=--openssl-legacy-provider yarn run serve

Test local setup

# Test website
http://localhost:8081

# Test filecheck: sc
http://localhost:8300/v3/filecheck/78891f02d2f13f02aece7e6201d90c6fcbabcfa5670af93fc5e8c430d98332dfdc28c1945d636c3d74d6c0e2ba5b0ae7a5368f0b3a4c623857f4ad35dcd08508

# Test filecheck: ZX81 Stuff
http://localhost:8300/v3/filecheck/e38c6a71cda106282f67d97ce3e94c70c21630702f0f97ae0813a3692e0af179e44b683e270e9598985b6ef847f1f8fc7b0208dc680536c8072ba59298a9ccec
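The long hex strings above are 128 hex characters, which matches a SHA-512 digest, so a lookup URL for a local file can presumably be built like this (a sketch; the endpoint and port are taken from the URLs above, the demo file is a stand-in):

```shell
# Compute a SHA-512 digest for a (demo) file and build a filecheck URL from it.
printf 'demo' > /tmp/demo.tap
HASH=$(sha512sum /tmp/demo.tap | cut -d' ' -f1)
echo "http://localhost:8300/v3/filecheck/${HASH}"

# Then, with the API running locally:
# curl -s "http://localhost:8300/v3/filecheck/${HASH}"
```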

Publishing to ZXInfo

# update assets, e.g. new generated screenshots
cd ../assets && ./update_ZXINFO.sh

# export ES instance and transfer to host
cd ../zxinfo-es/Scripts/transfer && ./export_zxinfo.sh

# remote to ES host and import new ES instance
ssh [email protected]
cd es-import
./import_zxinfo.sh

# update swagger/API info (v4)
cd ~/api.zxinfo.dk/zxinfo-api-v3
git branch (check for v4)
git pull

cd ~/api.zxinfo.dk
UID_GID="$(id -u):$(id -g)" docker compose up -d --no-deps --build zxinfo-api-v3

# web app
cd ~/zxinfo.dk/zxinfo-vue
git branch (check for v4)
git pull

cd ~/zxinfo.dk/zxinfo-vue
UID_GID="$(id -u):$(id -g)" docker compose up -d --no-deps --build zxinfo-vue


# Check deployment on: https://api.zxinfo.dk/v3/ (check update date)

# IF restart is required
NODE_ENV=development PORT=8300 DEBUG=zxinfo-api-v3:* nodemon --ignore public/javascripts/config.js --exec npm start

## Update WebApp
ssh -i ~/.ssh/thishost-rsync-key [email protected]
cd ./git/zxinfo-vue && git pull && docker-compose up -d --no-deps --build zxinfo-vue

UPDATE zxinfo-api-5 (BETA)

# transfer elasticsearch files to @zxinfo.dk
rsync -avz zxinfo_games*.txt -e 'ssh -i ~/.ssh/thishost-rsync-key' [email protected]:/home/docker/zxinfo-api-v5/es-import/

ssh [email protected]
cd zxinfo-api-v5
docker compose up -d zxinfo-api-v5 --build --no-deps

# attach to container
docker exec -ti zxinfo-api-v5-zxinfo-api-v5-1 sh

cd /es-import
sh import_zxinfo.sh

Clean up Elastic

Use 'elasticvue' with 'http://internal.zxinfo.dk/e/' port 80.

To avoid yellow shards on a single-node cluster, set the number of replicas to 0:

PUT /_settings
{
  "index": {
    "number_of_replicas": 0
  }
}
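The same replica change can be issued with curl from the shell (assumes the instance answers on localhost:9200; the request itself is left commented out so nothing is changed by accident):

```shell
# Same settings body as above, as a one-line JSON payload.
PAYLOAD='{"index":{"number_of_replicas":0}}'
echo "$PAYLOAD"

# Against the live instance:
# curl -s -X PUT 'http://localhost:9200/_settings' \
#   -H 'Content-Type: application/json' -d "$PAYLOAD"
```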
ssh -i ~/.ssh/thishost-rsync-key [email protected]
screen -dr
curl 'http://localhost:9200/_cat/aliases'
# NOTE the zxinfo_games alias - this should NOT be deleted
curl 'http://localhost:9200/_cat/indices'
curl -XDELETE http://localhost:9200/zxinfo-20220830-131257