3. Project Milestone 3: Details and Instructions - airavata-courses/TeamAlpha GitHub Wiki

Instructions for Milestone-3 execution:

PostgreSQL is needed to set up the database; the scripts are provided in the db scripts folder. Tomcat must also be installed and the server started on localhost, after which the application is available at localhost:8080/Airavata_Remote_Job_Runner/login.htm


Installing and Setting Up the Database:

  1. Open terminal

  2. sudo apt-get install postgresql-client

  3. sudo apt-get install postgresql postgresql-contrib

  4. sudo apt-get install pgadmin3

for more instructions go to: https://help.ubuntu.com/community/PostgreSQL or http://www.postgresql.org/download/macosx/ for mac

After database is setup:

  1. Check whether the user 'postgres' exists - sudo -u postgres psql postgres -c "select * from pg_user where usename = 'postgres';"

  2. If the user postgres does not exist, create it - createuser postgres

  3. Run the DDL script present at - https://github.com/airavata-courses/TeamAlpha/blob/Milestone-3/Airavata_Remote_Job_Runner/db/ddl_02282016.sql

Run the script using the following command -

sudo -u postgres psql postgres -a -f [PATH_OF_SQL_FILE]

  4. Then run the DML script present at - https://github.com/airavata-courses/TeamAlpha/blob/Milestone-3/Airavata_Remote_Job_Runner/db/dml_02282016.sql

Run the script using the same command -

sudo -u postgres psql postgres -a -f [PATH_OF_SQL_FILE]

User Inputs: Below are the fields expected from users.

These fields are defined in the file at: https://github.com/airavata-courses/TeamAlpha/blob/Milestone-4/Airavata_Remote_Job_Runner/src/main/resources/user_input.properties

Property File:

private.key.path=path to the private key for which SSH is configured

private.key.passphrase=pass phrase for the key if set

user.name=username

user.job.file.path=path to the directory where job file is kept

user.job.file.name=job file name

user.job.remotefile.path=path to the directory on the server

retry.time.interval=time interval between successive requests to monitor job status (milliseconds)

default.retry.attempts=default number of attempts (changes as per the required time provided by the server)
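The two monitoring settings above drive a simple poll-and-retry loop. Below is a minimal shell sketch of reading a Java-style properties file and polling the configured number of times; the sample file contents and the poll placeholder are illustrative assumptions - the actual project does this in Java against the cluster.

```shell
#!/bin/sh
# Sketch: read monitoring settings from a properties file and poll job status.

# create a sample properties file for illustration (values are assumptions)
cat > user_input.properties <<'EOF'
retry.time.interval=2000
default.retry.attempts=3
EOF

# helper: read one key from a Java-style .properties file
prop() {
  grep "^$1=" user_input.properties | cut -d'=' -f2-
}

interval_ms=$(prop retry.time.interval)
attempts=$(prop default.retry.attempts)
echo "interval_ms=$interval_ms attempts=$attempts"

i=1
while [ "$i" -le "$attempts" ]; do
  echo "poll attempt $i"              # the real code would query job status here
  # sleep "$((interval_ms / 1000))"   # pause between polls (disabled in the sketch)
  i=$((i + 1))
done
```

Once the status check reports completion (or the attempt budget is exhausted), the loop would stop instead of running a fixed count.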

Login details: id: airavata password: aaa


Running the Job Instruction:

  1. Go to 'localhost:8080/Airavata_Remote_Job_Runner/login.htm'

  2. Log in using the username and password provided above

  3. Click Create a Job to submit a job to the clusters

  4. If you are submitting a GROMACS job, you should upload the (.trp) file in the first upload field and the (.gro) file in the second upload field (accepting the files in any order will be handled in the future)
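For reference, the job file uploaded for a PBS job is an ordinary shell script whose #PBS comment lines carry scheduler directives. A minimal illustrative sketch (the job name and resource values are assumptions, not the project's actual settings):

```shell
#!/bin/sh
# Minimal PBS job file sketch. Lines starting with #PBS are scheduler
# directives; the rest is a plain shell script, so it also runs as-is.
#PBS -N sample_job          # job name shown by qstat
#PBS -l nodes=1:ppn=1       # request one core on one node
#PBS -l walltime=00:05:00   # maximum allowed run time
#PBS -j oe                  # merge stdout and stderr into one file

echo "job started"
# the real work would go here, e.g. invoking a simulation binary
echo "job finished"
```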


Implementation Details:

Passwords are hashed before being stored in the database. We have implemented Spring Security for user management, and we use SSL for secure communication.
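The hashing code itself is not shown on this page; as an illustration of the idea, here is a minimal salted-hash sketch in shell. The salt value and the use of SHA-256 via sha256sum are assumptions - the actual project presumably hashes in Java, e.g. through a Spring Security password encoder.

```shell
#!/bin/sh
# Sketch of salted password hashing (illustrative, not the project's code).

salt="s9f8a7"     # per-user random salt, stored alongside the hash (assumption)
password="aaa"    # the demo password from this wiki page

# hash = SHA-256(salt || password); only the salt and hash go in the database
hash=$(printf '%s%s' "$salt" "$password" | sha256sum | cut -d' ' -f1)
echo "stored: $salt:$hash"

# login check: recompute with the stored salt and compare
candidate=$(printf '%s%s' "$salt" "aaa" | sha256sum | cut -d' ' -f1)
[ "$candidate" = "$hash" ] && echo "login ok"
```

Because only the hash is stored, a database leak does not directly expose the plain-text password.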

To make use of SSL, the server.xml file of the Tomcat server needs to be updated by adding an HTTPS connector.
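The connector snippet itself is not reproduced on this page; a typical Tomcat HTTPS connector of the kind referred to looks like the following (the port, keystorePass, and paths are placeholders - the keystoreFile must match the keystore created with keytool below):

```xml
<Connector port="8443" protocol="HTTP/1.1" SSLEnabled="true"
           maxThreads="150" scheme="https" secure="true"
           keystoreFile="/path/to/my/keystore" keystorePass="changeit"
           clientAuth="false" sslProtocol="TLS" />
```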

Creating the keystore:

(Windows) "%JAVA_HOME%\bin\keytool" -genkey -alias tomcat -keyalg RSA -keystore \path\to\my\keystore

(Linux) $JAVA_HOME/bin/keytool -genkey -alias tomcat -keyalg RSA -keystore /path/to/my/keystore

/path/to/my/keystore should be the same path as the one given in the server.xml file.

Once logged in, the user can see the list of all submitted jobs and their status. To submit a new job, the user has to click on Create Job, supply the input fields, and upload a job file in the case of a PBS job. The newly submitted job is added to the data table, and once it completes, the output files become available for download: the output file is downloaded if the job executed successfully, and the error file is downloaded if the job did not execute properly.

If the user does not perform any activity for over 20 minutes after logging in, the session ends and the user is logged out.

Only a single user session is allowed; if the user tries to log in from another system while one session is active, they are logged out of the previous session.


Future development:

In the next milestone we will maintain a database of all submitted jobs. As of now, only jobs whose status is available on the server are displayed in the data table; next time we will show details of all jobs submitted since the beginning. We have also not yet added any validation for file uploads; next time we will validate the type of file being uploaded. The output file names are currently hardcoded; once we maintain the database, we will append job IDs to them.