Pre-computed OI Architecture

Currently isetonline uses a multi-tier architecture:

< The notes below are from the original design for OIs and need an update. >

MATLAB code to create the needed data

  • ol_dataPrep.m enumerates the desired OI and sensor files and uses ISETCam to create a sensorImage for each OI/sensor pair (see the sketch after this list)
  • sensor parameters such as shutter time are chosen (plenty of room for experimentation)
  • single-exposure, burst, and bracketed captures are all computed
  • there is an option to simulate camera motion, as a poor substitute for re-rendering the OI
  • those files, along with data for the lenses being used, are exported in JSON format
  • Optionally, the data can also be stored in MongoDB, but currently we don't use it for anything
  • As an example/sample, the generated images are also run through YOLOv4 and annotated for use online
  • RGB previews are also generated using a simplistic IP pipeline
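
For concreteness, here is a minimal MATLAB sketch of the per-pair loop described above, using standard ISETCam calls (sensorCreate, sensorSet, sensorCompute, ipCompute). The file names, sensor model name, and exposure value are placeholders, and the exact get/set parameter strings may vary between ISETCam versions; see ol_dataPrep.m for the real logic.

```matlab
% Illustrative sketch only -- not the actual ol_dataPrep.m.
ieInit;

oiFiles     = {'oi_nighttime_001.mat'};   % hypothetical pre-computed OI files
sensorNames = {'imx363'};                 % hypothetical sensor model

for ii = 1:numel(oiFiles)
    load(oiFiles{ii}, 'oi');                              % assumes each file holds an OI struct named 'oi'
    for jj = 1:numel(sensorNames)
        sensor = sensorCreate(sensorNames{jj});
        sensor = sensorSet(sensor, 'exp time', 0.016);    % single-exposure shutter time (s)
        sensor = sensorCompute(sensor, oi);

        % Simplistic IP pipeline for the RGB preview
        ip  = ipCreate;
        ip  = ipCompute(ip, sensor);
        rgb = ipGet(ip, 'data srgb');
        imwrite(rgb, sprintf('preview_%02d_%02d.png', ii, jj));
    end
end
```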

A Node.js app, written with React, that is built into a static website using npm

  • The app includes a variety of CoreUI components and a file-driven version of Ag-Grid
  • The app uses the .json files generated by ol_dataPrep.m (a sketch of a possible record format appears after this list)
  • It allows the user to scroll through, sort, and filter sensorImages by parameter
  • The user can also see the effects of bracket and burst capture.
  • The user can also show/hide the results of YOLOv4 on the various sensor images.
  • The user can also download the raw OI data, sensor info, or sensorImage data
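
The front end is file-driven, so each sensorImage is described by a small JSON record. The sketch below shows one plausible shape for such a record, written with MATLAB's built-in jsonencode as a stand-in for however ol_dataPrep.m actually serializes the data; every field name here is an assumption, not the real schema.

```matlab
% Hypothetical shape of one sensorImage record in the exported JSON; the
% real field names are defined by ol_dataPrep.m and may differ.
record.oiFile      = 'oi_nighttime_001.mat';
record.sensorName  = 'imx363';
record.expTime     = 0.016;                  % shutter time in seconds
record.captureMode = 'burst';                % 'single' | 'burst' | 'bracket'
record.preview     = 'preview_01_01.png';    % RGB preview written by the IP step
record.yolo        = struct('label', {'car'}, 'score', {0.91}, ...
                            'bbox', {[120 64 85 40]});   % sample annotation

fid = fopen('sensorImage_01_01.json', 'w');
fprintf(fid, '%s', jsonencode(record));      % stand-in for the real export step
fclose(fid);
```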

oi2sensor Compute framework

  • An app built with the Express framework gives users access to pre-compiled pieces of ISET code so they can experiment
  • The front end can submit an OI filename and a sensor object and ask the backend to compute the resulting sensorImage
  • The server invokes a pre-compiled custom MATLAB function, oi2sensor, and writes out the result (a sketch appears after this list)
  • It then returns (or will return) the output filename
  • Right now, the server runs on a different port from the client; before production they should be integrated
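
A plausible shape for the MATLAB function that gets compiled and called by the Express server is sketched below. The argument list, the assumption that the OI file holds a struct named 'oi', the sensor JSON fields, and the output location are all placeholders; the real interface is defined in the isetonline repo.

```matlab
function outFile = oi2sensor(oiFile, sensorFile)
% Sketch of the compiled entry point (assumed interface, not the actual code).
%   oiFile     - path to a .mat file containing an ISETCam OI struct named 'oi'
%   sensorFile - path to a JSON description of the sensor parameters
%   outFile    - path of the .mat file holding the computed sensorImage

    load(oiFile, 'oi');

    txt    = fileread(sensorFile);
    params = jsondecode(txt);                        % e.g. hypothetical fields 'name', 'expTime'

    sensor = sensorCreate(params.name);
    sensor = sensorSet(sensor, 'exp time', params.expTime);
    sensor = sensorCompute(sensor, oi);

    outFile = fullfile(tempdir, 'sensorImage.mat');  % placeholder output location
    save(outFile, 'sensor');
end
```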

FUTURE

  • If we need to scale, tie in MongoDB