10.08.15 restart - daplff/hiWi_cont GitHub Wiki
Time to take the next step
What is done?
- I/O from Fortran
- C++ class framework
What's next?
- netcdf
- result validations
- direct I/O to fortran without extra copy (possible?)
- documentation
- startup parameters from a class (idea: have a basic/default tree, load it or a custom one, then edit entries on demand)
- of course, extend the program to more simulator variables in C++.
Today, I should probably start with netcdf. Subtasks:
- Get familiar with netcdf: create a test dataset with junk data and load/store it in a test program.
- Implement, starting with Xpos.
- (later) extend to more variables.
Getting familiar
So: checked out the official netcdf tutorial, pretty comprehensive: http://www.unidata.ucar.edu/software/netcdf/docs/netcdf-tutorial.html . It seems to support particle data structures pretty well; in netcdf4 you can even have hierarchical groups and variable-length arrays of things. So maybe a compound type for each particle? Or just a group? Probably just a group. A variable-length array sounds good but cumbersome and incompressible.
Changed my mind: a variable-length array sounds like the way to go for the number of particles, since it might actually vary. Or at least compound types for multiple variables later?
Implementing, starting with (time and) xpos
Played around enough to get a bit confident, and decided to implement it in a class called ParticleOutputter, with functions for initialisation with a given output file name and for writing a single timestep. The choice fell on unlimited dimensions in both time and particle number, and we'll just see how that works. Time gets coordinates according to in-simulation time as well. Chose to use the C bindings, since the C++ ones can't use all the features of netcdf4 (including multiple unlimited dimensions and variable-length arrays, as well as the future possibility of compound types).
Taking notes on this wiki
Seems to work well.