
pfcon FS and DS plugin example on moc ppc64le direct

Abstract

This page provides instructions for interacting with pfcon in a manner similar to how CUBE would. Data is uploaded to swift storage, an FS plugin is run on the data, and then a DS plugin is run on the results.

The set of operations is:

  • pull a sample dataset from GitHub;
  • push the dataset into swift object storage;
  • run an FS plugin called pl-dircopy that copies the data from swift storage and reorganizes it in a different location;
  • run a DS plugin on the results that extracts some DICOM meta data.

PRECONDITIONS

  • pfurl and supporting requirements. Doing a pip install pfurl in a python virtualenv should take care of everything (a quick sanity check is sketched at the end of this section). If, however, there are issues with CURL_OPENSSL_3, you might need to also do (for Ubuntu):
apt-get install -y libssl-dev libcurl4-openssl-dev
  • the pl-mri_convert_ppc64 image on the Power9 machine

and check its version:

docker run --rm local/pl-mri_convert_ppc64 \
       mri_convert_ppc64.py --version 

which should reply with (at time of writing)

0.1
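
A quick sanity check of the pfurl precondition (this only assumes that pip installed pfurl into the active virtualenv):

# Confirm the pfurl executable is on the PATH of the active virtualenv
command -v pfurl || echo "pfurl not found -- activate the virtualenv or pip install pfurl"

# Show the installed package name, version and location
pip show pfurl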

Detailed steps

Base directory

Clone the pfcon repo and make sure you are in its base directory:

git clone https://github.com/FNNDSC/pfcon.git
cd pfcon

Set convenience environment variables

export HOST_IP=$(ip route | grep -v docker | awk '{if(NF==11) print $9}')
export HOST_PORT=8000
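
Since the HOST_IP extraction depends on the exact output format of ip route, it is worth confirming that the variables resolved to something sensible:

# HOST_IP should be the host's LAN address, not empty
echo "pfcon host: ${HOST_IP}:${HOST_PORT}"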

Pull image data

Pull a sample data set that will be used in this example:

git clone https://github.com/FNNDSC/SAG-anon

Set a convenience variable pointing at the local data:

export DICOMDIR=$(pwd)/SAG-anon
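
As a quick check, count the files in the directory; the SAG-anon sample contains 192 DICOM files, as the listings further down show:

# Expect 192 DICOM files in the sample dataset
ls ${DICOMDIR}/*.dcm | wc -l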

Instantiate pfcon services

unmake ; sudo rm -fr FS; rm -fr FS ; make
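
Once make completes, verify that the coordinating services are up. This is a minimal sketch, assuming the containers are named after the services they run (pfcon, pfioh, pman):

# The three coordinating services should be running
docker ps --format '{{.Names}}\t{{.Status}}' | grep -E 'pfcon|pfioh|pman'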

PUSH data into swift object storage

If you have just instantiated the pfcon services, re-export the convenience variables in a new terminal:

cd pfcon
export HOST_IP=$(ip route | grep -v docker | awk '{if(NF==11) print $9}')
export HOST_PORT=8000
export DICOMDIR=$(pwd)/SAG-anon

PUSH

The PUSH operation relies on the swift command line client, which is just a pip install away:

pip install python-swiftclient
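
You can confirm that the swift client is available before pushing:

# The swift CLI should now be on the PATH
command -v swift && swift --version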

Now, use the swiftCtl.sh script to push the data:

./swiftCtl.sh -A push -E dcm -D $DICOMDIR -P chris/uploads/DICOM/dataset1

VERIFY

./swiftCtl.sh

You should see a listing of files in swift storage:

chris/uploads/DICOM/dataset1/0001-1.3.12.2.1107.5.2.19.45152.2013030808110258929186035.dcm
chris/uploads/DICOM/dataset1/0002-1.3.12.2.1107.5.2.19.45152.2013030808110261698786039.dcm
chris/uploads/DICOM/dataset1/0003-1.3.12.2.1107.5.2.19.45152.2013030808110259940386037.dcm
chris/uploads/DICOM/dataset1/0004-1.3.12.2.1107.5.2.19.45152.2013030808110256555586033.dcm
...
...
chris/uploads/DICOM/dataset1/0190-1.3.12.2.1107.5.2.19.45152.2013030808105512578785411.dcm
chris/uploads/DICOM/dataset1/0191-1.3.12.2.1107.5.2.19.45152.2013030808105486367685381.dcm
chris/uploads/DICOM/dataset1/0192-1.3.12.2.1107.5.2.19.45152.2013030808105485455785379.dcm
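
As an additional check, count the objects under the upload path; assuming swiftCtl.sh prints one object per line as above, the count should match the 192 files in the dataset:

# Expect 192 objects under the upload path
./swiftCtl.sh | grep 'chris/uploads/DICOM/dataset1/' | wc -l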

Create the equivalent of an FS feed

Using pfurl, call pfcon passing it the location of the images in storage and instructing the pl-dircopy plugin to copy these files to a new location:

Call

pfurl \
                    --verb POST --raw --http ${HOST_IP}:5005/api/v1/cmd \
                    --httpResponseBodyParse                             \
                    --jsonwrapper 'payload' --msg '
            {
    "action": "coordinate",
    "meta-compute": {
        "auid": "chris",
        "cmd": "python3 /usr/src/dircopy/dircopy.py /share/outgoing --saveinputmeta --saveoutputmeta --dir /share/incoming",
        "container": {
            "manager": {
                "app": "swarm.py",
                "env": {
                    "meta-store": "key",
                    "serviceName": "1",
                    "serviceType": "docker",
                    "shareDir": "%shareDir"
                },
                "image": "fnndsc/swarm"
            },
            "target": {
                "cmdParse": false,
                "execshell": "python3",
                "image": "fnndsc/pl-dircopy",
                "selfexec": "dircopy.py",
                "selfpath": "/usr/src/dircopy"
            }
        },
        "cpu_limit": "1000m",
        "gpu_limit": 0,
        "jid": "1",
        "memory_limit": "200Mi",
        "number_of_workers": "1",
        "service": "host",
        "threaded": true
    },
    "meta-data": {
        "localSource": {
            "path": "chris/uploads/DICOM/dataset1",
            "storageType": "swift"
        },
        "localTarget": {
            "createDir": true,
            "path": "chris/feed_1/dircopy_1/data"
        },
        "remote": {
            "key": "%meta-store"
        },
        "service": "host",
        "specialHandling": {
            "cleanup": true,
            "op": "plugin"
        },
        "transport": {
            "compress": {
                "archive": "zip",
                "cleanup": true,
                "unpack": true
            },
            "mechanism": "compress"
        }
    },
    "meta-store": {
        "key": "jid",
        "meta": "meta-compute"
    },
    "threadAction": true
} ' --quiet --jsonpprintindent 4
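
While the job runs, you can also poll pfcon directly for its status with pfurl. The payload below is a hedged sketch: it assumes pfcon's "status" action accepts the same jid key ("1") used in the coordinate call above; consult the pfcon API documentation for the exact message format:

pfurl \
                    --verb POST --raw --http ${HOST_IP}:5005/api/v1/cmd \
                    --httpResponseBodyParse                             \
                    --jsonwrapper 'payload' --msg '
            {
    "action": "status",
    "meta": {
        "remote": {
            "key": "1"
        }
    }
} ' --quiet --jsonpprintindent 4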

Receive

If you are monitoring the containers, you can, in three separate terminals, do:

# In terminal 1:
docker-compose logs --follow pfcon_service

# In terminal 2:
docker-compose logs --follow pman_service

# In terminal 3:
docker-compose logs --follow pfioh_service

If all goes well, in the pfcon log terminal you should see the tail end of a long output:

            "chris/feed_1/dircopy_1/data/0189-1.3.12.2.1107.5.2.19.45152.2013030808105517130085417.dcm",
            "chris/feed_1/dircopy_1/data/0190-1.3.12.2.1107.5.2.19.45152.2013030808105512578785411.dcm",
            "chris/feed_1/dircopy_1/data/0191-1.3.12.2.1107.5.2.19.45152.2013030808105486367685381.dcm",
            "chris/feed_1/dircopy_1/data/0192-1.3.12.2.1107.5.2.19.45152.2013030808105485455785379.dcm",
            "chris/feed_1/dircopy_1/data/input.meta.json",
            "chris/feed_1/dircopy_1/data/output.meta.json"
        ],
        "fullPath": "chris/feed_1/dircopy_1/data"
    }
}

Verify

Check the contents of the swift storage:

./swiftCtl.sh

and check that the following files exist:

chris/feed_1/dircopy_1/data/0001-1.3.12.2.1107.5.2.19.45152.2013030808110258929186035.dcm
chris/feed_1/dircopy_1/data/0002-1.3.12.2.1107.5.2.19.45152.2013030808110261698786039.dcm
chris/feed_1/dircopy_1/data/0003-1.3.12.2.1107.5.2.19.45152.2013030808110259940386037.dcm
chris/feed_1/dircopy_1/data/0004-1.3.12.2.1107.5.2.19.45152.2013030808110256555586033.dcm
chris/feed_1/dircopy_1/data/0005-1.3.12.2.1107.5.2.19.45152.2013030808110251492986029.dcm
...
chris/feed_1/dircopy_1/data/0190-1.3.12.2.1107.5.2.19.45152.2013030808105512578785411.dcm
chris/feed_1/dircopy_1/data/0191-1.3.12.2.1107.5.2.19.45152.2013030808105486367685381.dcm
chris/feed_1/dircopy_1/data/0192-1.3.12.2.1107.5.2.19.45152.2013030808105485455785379.dcm
chris/feed_1/dircopy_1/data/input.meta.json
chris/feed_1/dircopy_1/data/jobStatus.json
chris/feed_1/dircopy_1/data/jobStatusSummary.json
chris/feed_1/dircopy_1/data/output.meta.json
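
Assuming the same one-object-per-line listing, a count of the dircopy output should come to 196 objects (the 192 DICOM files plus the four *.json files above):

# Expect 196 objects under the dircopy output path
./swiftCtl.sh | grep 'chris/feed_1/dircopy_1/data/' | wc -l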

Create the equivalent of a DS feed

We will now run this data through another plugin, pl-pfdicom_tagExtract, which will extract meta data from the files and also convert the middle image to a jpg:

Call


pfurl \
                    --verb POST --raw --http ${HOST_IP}:5005/api/v1/cmd \
                    --httpResponseBodyParse                             \
                    --jsonwrapper 'payload' --msg '
            {
    "action": "coordinate",
    "meta-compute": {
        "auid": "chris",
        "cmd": "python3 /usr/src/dcm_tagExtract/dcm_tagExtract.py /share/incoming /share/outgoing --saveinputmeta --saveoutputmeta -e dcm -m m:%_nospc|-_ProtocolName.jpg -s 2 -o %PatientID-%PatientAge -t raw,json,html,dict,col,csv --useIndexhtml --threads 0 -v 5",
        "container": {
            "manager": {
                "app": "swarm.py",
                "env": {
                    "meta-store": "key",
                    "serviceName": "2",
                    "serviceType": "docker",
                    "shareDir": "%shareDir"
                },
                "image": "fnndsc/swarm"
            },
            "target": {
                "cmdParse": false,
                "execshell": "python3",
                "image": "fnndsc/pl-pfdicom_tagextract",
                "selfexec": "dcm_tagExtract.py",
                "selfpath": "/usr/src/dcm_tagExtract"
            }
        },
        "cpu_limit": "1000m",
        "gpu_limit": 0,
        "jid": "2",
        "memory_limit": "200Mi",
        "number_of_workers": "1",
        "service": "host",
        "threaded": true
    },
    "meta-data": {
        "localSource": {
            "path": "chris/feed_1/dircopy_1/data",
            "storageType": "swift"
        },
        "localTarget": {
            "createDir": true,
            "path": "chris/feed_1/dircopy_1/pfdicom_tagextract_2/data"
        },
        "remote": {
            "key": "%meta-store"
        },
        "service": "host",
        "specialHandling": {
            "cleanup": true,
            "op": "plugin"
        },
        "transport": {
            "compress": {
                "archive": "zip",
                "cleanup": true,
                "unpack": true
            },
            "mechanism": "compress"
        }
    },
    "meta-store": {
        "key": "jid",
        "meta": "meta-compute"
    },
    "threadAction": true
} ' --quiet --jsonpprintindent 4
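
The status poll sketched after the first call can be reused here to follow this job; change the "key" value in its payload from "1" to "2" (the jid of this run).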
            

Receive

If you are monitoring the containers, you can, in three separate terminals, do:

# In terminal 1:
docker-compose logs --follow pfcon_service

# In terminal 2:
docker-compose logs --follow pman_service

# In terminal 3:
docker-compose logs --follow pfioh_service

If all goes well, in the pfcon log terminal you should see the tail end of a long output:

      "lsList": [
            "chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/1449c1d-003Y-col.txt",
            "chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/1449c1d-003Y-csv.txt",
            "chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/1449c1d-003Y-dict.txt",
            "chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/1449c1d-003Y-raw.txt",
            "chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/1449c1d-003Y.json",
            "chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/SAG-MPRAGE-220-FOV.jpg",
            "chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/index.html",
            "chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/input.meta.json",
            "chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/output.meta.json"
        ],
        "fullPath": "chris/feed_1/dircopy_1/pfdicom_tagextract_2/data"
    }
}

Verify

Check the contents of the swift storage:

./swiftCtl.sh

and check that the following files exist:

chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/1449c1d-003Y-col.txt
chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/1449c1d-003Y-csv.txt
chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/1449c1d-003Y-dict.txt
chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/1449c1d-003Y-raw.txt
chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/1449c1d-003Y.json
chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/SAG-MPRAGE-220-FOV.jpg
chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/index.html
chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/input.meta.json
chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/jobStatus.json
chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/jobStatusSummary.json
chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/output.meta.json

Check the image and index.html

Pull the created files

Using the swiftCtl.sh script, pull the resultant files from swift storage:

./swiftCtl.sh -A pull -P chris/feed_1/dircopy_1/pfdicom_tagextract_2/data -O pull

which should pull the files into a tree starting at ./pull.
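
Once pulled, you can inspect the results locally. This sketch assumes swiftCtl.sh recreates the object paths under ./pull; adjust the paths if your layout differs:

# Locate the generated report and image in the pulled tree
find pull -name 'index.html' -o -name '*.jpg'

# Open the report in a browser (Linux); any browser or image viewer will do
xdg-open "$(find pull -name 'index.html' | head -n 1)"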

-30-