Dependabot on GitHub

While flailing about trying to get the Docker & WSL Publish action to, well, action, I enabled Dependabot on my fork.

It found four dependencies that it said needed to be updated:
[screenshot of the four flagged dependency updates]

I merged them all because YOLO, and by happy accident I managed to get the Docker image to build and publish to my private registry, whereupon I had to use wsld to fetch it and turn it into a WSL2 rootfs because I still can’t fix that GitHub Action fully.
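For anyone following along, the rootfs conversion boils down to exporting the container filesystem and then importing it into WSL2. A rough manual sketch of what wsld automates for me (the registry path, container name, and install location below are placeholders, not the fork’s actual values, and wsld may do this differently under the hood):

# Pull the published image and export its filesystem as a tarball
docker pull myregistry.example.com/odm-wsl:latest
docker create --name odm-rootfs myregistry.example.com/odm-wsl:latest
docker export odm-rootfs -o odm-rootfs.tar
docker rm odm-rootfs

# From Windows, import the tarball as a WSL2 distro
wsl --import ODM C:\WSL\ODM odm-rootfs.tar --version 2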

Long story short, I’m processing right now using my forked WSL image, and ODM appears to be working just fine with those updated dependencies.
[screenshot]

Perhaps enabling Dependabot on the real repo is something to consider?
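If it were done with a config file (rather than the per-repo security-updates toggle), a minimal sketch might look something like the following; the ecosystems and schedule here are just an example, not a recommendation:

# Hypothetical .github/dependabot.yml for the repo
mkdir -p .github
cat > .github/dependabot.yml <<'EOF'
version: 2
updates:
  - package-ecosystem: "pip"
    directory: "/"
    schedule:
      interval: "weekly"
  - package-ecosystem: "docker"
    directory: "/"
    schedule:
      interval: "weekly"
EOF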

I’m personally not too big a fan of Dependabot (securitybot and whatever other bot)… the project doesn’t have to always be on the latest library, and sometimes stuff breaks (sometimes in non-obvious ways). Happy to take a pull request for upgraded pip dependencies, though!

I’d recommend running https://github.com/OpenDroneMap/oats on enough datasets to cover most of the code paths before doing so :+1:
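Something along these lines should do it (the exact invocation may vary; check the oats README):

git clone https://github.com/OpenDroneMap/oats
cd oats
./run --all    # run the full suite of test datasets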


I might need your help debugging the test suite under WSL, but other than a specific issue with wget and paths, I think it is working well.

I ran ./run --all:
[screenshot of the test run output]

It would appear that some of the tests use wget to fetch their input datasets, and the wget command is failing with:

(from function `check_download_dataset' in file tests/build/functions.bash, line 82,
    from function `run_test' in file tests/build/functions.bash, line 25,
    in test file tests/build/mica_split_latest.bats, line 5)
     `run_test "--dsm --fast-orthophoto --split 24 --split-overlap 50" "latest"' failed with status 8  

8 Server issued an error response.

Code block from \\wsl\ODM\root\oats\tests\build\functions.bash:

check_download_dataset(){
    dataset="$1"

    if [ ! -e ./datasets/$dataset/images ] && [ ! -z $DATASET_URL ]; then
        if [ ! -e ./datasets/$dataset ]; then
            mkdir ./datasets/$dataset
        fi

        wget $DATASET_URL -q -O ./datasets/$dataset/download.zip   # <-- line 82, where the failure occurs

        cd ./datasets/$dataset/
        unzip ./download.zip 2>/dev/null
        rm ./download.zip

        # Remove top level directory if needed
        for dir in $(ls -d */); do
            if [ "$dir" != "images/" ]; then
                mv "$dir"/* .
                rm -fr "$dir"
            fi
        done

        # Check images path
        if [ ! -e ./images ]; then
            mkdir images
            mv *.* images
        fi

        cd ../../
    fi
}
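For what it’s worth, wget’s exit status 8 is its generic “server issued an error response” code, so re-running the line-82 download by hand with the server response printed should show what the server is actually returning (DATASET_URL is whatever the test environment exports; the output path here is arbitrary):

# Reproduce the failing download and print the HTTP response headers
wget -S "$DATASET_URL" -O /tmp/download.zip
echo "wget exit status: $?"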

Oops, are those datasets private? Perhaps that’s the issue.


Uhhh…

As in, did I mess up permissions and need to chmod the oats folder, or can the test suite simply not reach the resource it’s being directed to fetch over HTTPS?

I think some datasets reference private repos. So you didn’t mess up.


Alright, so some failures may indeed be misleading. That’s good to know.

Do you feel confident enough to let me open those PRs, or do you want me to try to analyze the results folder before I open them?

Got these folders of results:
[screenshot of the results folders]

To my eye, everything looked reasonable except the ortho output for odm_boruszyn_kap:
[screenshot of the odm_boruszyn_kap orthophoto]

Which is odd because the mesh looks reasonable, I think:
[screenshot of the odm_boruszyn_kap mesh]

If memory serves, odm_boruszyn_kap doesn’t have GPS data, so we can expect the ortho to potentially look terrible.


It’s probably safe to open a PR. It would be good to process a dataset from a Micasense camera (16-bit TIFFs) in case you haven’t, but otherwise it looks good :+1:
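If you’re doing that inside your WSL distro, the invocation is basically the same as a native install. A rough sketch, assuming the code lives at /code as in the Docker image and with the dataset path and project name as placeholders:

# Process a Micasense (16-bit multispectral) dataset with radiometric calibration
python3 /code/run.py --project-path /datasets micasense_project \
    --radiometric-calibration camera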
