4 – Tutorials
Argus Graphical Interface Tutorial – individual step-by-step instructions with companion videos
This series of tutorials takes you through each of the six programs to demonstrate their individual functionalities. We suggest you download the tutorial data (~350 megabytes) so you can follow along on your computer. You should also read the full manual either before or after going through the tutorials since the tutorials are fairly direct and step-by-step and may not give you a complete sense of what each Argus tool accomplishes or why it needs to be done.
Jump to: DWarp | Sync | Patterns | Calibrate | Clicker | Wand | Argus for Design
DWarp
1. Open Argus.
2. Open DWarp by pressing the leftmost button labeled ‘DWarp’.
3. Click the ‘Open’ button directly under the title towards the top of the window.
4. Browse to find ‘dwarp-tutorial.mp4’ included in the tutorial files directory and select it.
5. Change the ‘Camera Model’ drop down in the ‘Lens Parameters’ section to ‘GoPro Hero4 Black’.
6. Change the neighboring ‘Shooting Mode’ drop down to ‘1080p-30fps-wide (Fisheye)’.
7. Observe that the entry boxes in the Lens Parameters section grey out. This is because we have chosen an omnidirectional distortion profile (better suited to wide modes), so the pinhole distortion coefficients do not apply.
8. Click the ‘Specify’ button at the bottom of the window and browse for the directory you’d like to write the undistorted test video to. Type the name in as ‘dwarp_test_output.mp4’ or any other file name ending in ‘.mp4’.
9. Watch as your video is undistorted and written to the specified location.
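The pinhole distortion coefficients that DWarp works with describe how straight lines bow in the raw footage. As a rough illustration, here is a sketch of the standard radial model with made-up k1 and k2 values (our own illustration, not DWarp's internal code):

```python
import numpy as np

def distort_points(xy, k1, k2):
    """Apply simple radial (pinhole-model) distortion to normalized
    image coordinates; undistortion, as DWarp performs it, is the
    inverse of this mapping."""
    xy = np.asarray(xy, dtype=float)
    r2 = np.sum(xy ** 2, axis=1, keepdims=True)  # squared radius of each point
    return xy * (1.0 + k1 * r2 + k2 * r2 ** 2)   # radial scaling factor

# With a negative k1 (barrel distortion), points far from the image
# center are pulled inward while the center is untouched.
pts = np.array([[0.0, 0.0], [0.5, 0.0], [1.0, 0.0]])
print(distort_points(pts, k1=-0.3, k2=0.1))
```

Points near the image edge move much more than points near the center, which is why wide (fisheye) modes need the omnidirectional model instead.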
Sync
1. Open Argus.
2. Open Sync by pressing the button labeled ‘Sync’.
3. Click the ‘+’ button and browse for ‘alpha.mp4’ included in the Argus test files directory. Select this file. Notice it has been added to the list toward the top of the window.
4. Do the above step for ‘beta.mp4’ and ‘sigma.mp4’.
5. Click the ‘Show waves’ button directly below the list, and wait for the audio to be ripped from the three videos.
6. Viewing the three wave files in the window which appears, notice that the beeps appear most clearly in the time window from 0.03 to 0.21 minutes.
7. Go back to the main window.
8. In the entry field labeled ‘Start:’ enter ‘0.03’.
9. In the entry field labeled ‘End:’ enter ‘0.21’.
10. Click the ‘Go’ button and wait for the offsets to be found.
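Under the hood, the offsets come from cross-correlating the audio tracks ripped from each video. A toy sketch of the idea, using synthetic signals rather than Argus's actual code:

```python
import numpy as np

def audio_offset(a, b):
    """Estimate the lag (in samples) of signal b relative to signal a
    by full cross-correlation -- the same idea Sync applies to the
    audio tracks it extracts from each video."""
    corr = np.correlate(a, b, mode="full")
    # Index of the peak, recentred so 0 means 'already aligned'.
    return int(np.argmax(corr)) - (len(b) - 1)

rate = 1000                      # pretend 1 kHz audio
t = np.arange(rate) / rate
beep = np.sin(2 * np.pi * 100 * t) * (t > 0.5) * (t < 0.55)
shifted = np.roll(beep, 120)     # second camera started 120 samples later
print(audio_offset(shifted, beep))  # -> 120
```

Restricting the correlation to a short window containing a clean, loud event (a beep or clap) is exactly why Sync asks you for a start and end time.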
Patterns
1. Open Argus.
2. Open Patterns by clicking the button labeled ‘Patterns’.
3. Click the ‘Open’ button directly below the title.
4. Browse for ‘patterns-tutorial.mp4’ included in the Argus test directory and select it.
5. The defaults in the entry boxes are those for the test video and the printout we include with this suite, i.e. 12 rows, 9 columns, and a spacing of 0.2 between grid points.
6. Click the ‘Specify’ button towards the bottom of the window.
7. Click the ‘Go’ button and watch as the patterns are found frame by frame. Some patterns will fail to be found even in our test video. This is normal.
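For reference, the rows, columns, and spacing entries define the grid's real-world geometry, which the calibration step later pairs with the pixel locations Patterns detects. A sketch of how such object points are typically laid out (our own illustration, not Argus's code):

```python
import numpy as np

def grid_object_points(rows, cols, spacing):
    """Real-world (object) coordinates of a printed dot grid: one
    (x, y, z) triple per dot, with z = 0 on the plane of the page.
    Spacing is measured between dot centers."""
    ys, xs = np.mgrid[0:rows, 0:cols]
    pts = np.zeros((rows * cols, 3))
    pts[:, 0] = xs.ravel() * spacing
    pts[:, 1] = ys.ravel() * spacing
    return pts

# The tutorial grid: 12 rows x 9 columns, 0.2 units between dot centers.
obj = grid_object_points(12, 9, 0.2)
print(obj.shape)   # -> (108, 3)
print(obj[-1])     # the last dot sits at x = 1.6, y = 2.2
```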
Calibrate
1. Open Argus.
2. Open Calibrate by choosing the right-most button labeled ‘Calibrate’.
3. Click the ‘Open’ button towards the top of the window.
4. Browse for ‘patterns-tutorial.pkl’ included in the Argus test data directory and select it.
5. In the ‘Options’ section, change the entry box labeled ‘Number of replications’ from 100 to 10.
6. Check the box labeled ‘Invert grid coordinates’.
7. Now click the ‘Specify’ button and choose a directory where you’d like to save your calibration CSV file. Name the file ‘test.csv’ or any other name ending with ‘.csv’.
8. Click ‘Go’ and wait as the 10 replications are finished.
9. After the process, you can take the distortion parameters you obtained and plug them into DWarp to test them. However, the parameters may not be very accurate after just 10 replications.
A video tutorial for Calibrate
Clicker
1. Open Argus.
2. Click the second button from the right, labeled ‘Clicker’.
3. Click the plus button on the right and navigate to the video ‘alpha.mp4’ in the Argus tutorial folder. Put the offset as 0.
4. Repeat step 3 for ‘beta.mp4’ and ‘sigma.mp4’. Both of these videos also have an offset of 0.
5. Leave the resolution as the default and click ‘Go’.
6. Go to frame 10 by hitting the ‘G’ key and typing 10 into the popup window.
7. Press X to sync the windows to the same frame. Press it again to turn off sync.
8. Track one side of the visible wand for at least a hundred frames in all three videos. This can be accomplished either by clicking the center of the wand in every frame, or by clicking it in the first frame and hitting ‘A’ to use the auto-tracker. Auto-tracking can be improved by first growing the view finder at the bottom right using the 7, Y, U, and I keys (in arrow configuration). You can watch how well the auto-tracker is doing in the view-finder window.
9. Hit ‘O’ to bring up the options dialog.
10. Click the button labeled ‘Add track’. This track will be for the other side of the wand.
11. Click ‘Ok’ to go back to the video windows.
12. Repeat step 8 for the other side of the wand.
13. Once you’re done, hit the ‘S’ key to bring up a save dialog. Pick a location and type a tag like ‘tutorial’.
Wand
1. Open Argus.
2. Click on the right-most button labeled ‘Wand’.
3. Click the first button labeled ‘Open’, beside ‘Open cameras:’. This button lets you browse for a camera profile .txt file, explained in detail in the run-through section. Select ‘tutorial_cam.txt’ in the Argus-Wand tutorial folder.
4. Click the second button labeled ‘Open’, beside ‘Open paired points:’. This allows you to browse for a paired points .csv file. Select ‘tutorial-wand-xypts.csv’ in the Argus-Wand tutorial folder.
5. Set the scale as ‘0.2’.
6. Leave the optimization drop-down menus as their defaults of ‘Optimize none’.
7. Hit ‘Specify’ at the bottom, type ‘tutorial’, and pick a place on your hard drive to save the results.
8. Click ‘Go’ to see Wand reconstruct a 3D scene out of the pixel-coordinates and camera profile you provided. Notice that the reprojection errors printed last in the log window are quite high. You can improve these by adding more pixel correspondences in Clicker.
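Once Wand has written out per-camera DLT coefficients, triangulating a point clicked in two or more cameras is a small linear-algebra exercise. Here is a sketch of standard 11-parameter DLT triangulation; the coefficients below are synthetic values made up for the demo, not real Wand output:

```python
import numpy as np

def dlt_triangulate(dlt_coefs, uvs):
    """Least-squares 3D reconstruction of one point seen by >= 2
    cameras. dlt_coefs is (n_cams, 11), one row of standard DLT
    coefficients per camera; uvs is (n_cams, 2) pixel coordinates."""
    A, b = [], []
    for L, (u, v) in zip(np.asarray(dlt_coefs, float), np.asarray(uvs, float)):
        # Each camera contributes two linear equations in (X, Y, Z).
        A.append([L[0] - u * L[8], L[1] - u * L[9], L[2] - u * L[10]])
        A.append([L[4] - v * L[8], L[5] - v * L[9], L[6] - v * L[10]])
        b.extend([u - L[3], v - L[7]])
    xyz, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return xyz

# Two toy cameras that differ only by a horizontal offset.
cam1 = [0.2, 0, 0, 0,   0, 0.2, 0, 0, 0, 0, 0.2]
cam2 = [0.2, 0, 0, 0.2, 0, 0.2, 0, 0, 0, 0, 0.2]
print(dlt_triangulate([cam1, cam2], [[0.2, 0.4], [0.4, 0.4]]))  # ~ [1. 2. 0.]
```

With more than two cameras the same least-squares system simply gains two extra rows per camera, which is what makes the DLT form convenient for multi-camera work.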
Run-through: calibrating and undistorting your own videos
This tutorial takes you through the steps necessary to intrinsically calibrate your camera and shooting mode (i.e. figure out its distortion profile) and then remove lens distortion from your videos. The videos will be written to an MP4 with the original dimensions, frame rate, sound track, etc. While Argus-DWarp works well for most shooting modes and cameras, very wide-angle videos may not fully undistort. These videos can be undistorted, but you’ll lose a lot of the field of view when cropping down.
1. Print the dot grid located in the folder ‘Extras’ in the Argus package.
2. Film the dot grid for a short amount of time (anywhere from a minute to a few minutes) from various different angles and orientations in the desired shooting mode. Try your best to film the grid from various vantages while keeping the entire grid in view.
3. Upload your video to your computer.
4. Open Argus-Patterns.
5. Navigate to your video by clicking the button labeled ‘Open’.
6. Depending on how you oriented the grid relative to the camera, you may have to switch the rows and columns. The spacing between grid points should be the same as the default setting.
7. Specify a filename for the output Pickle file (.pkl), or simply click ‘Go’ and the filename and location will be the same as the input file.
8. Wait for Patterns to locate the grids in the video you provided.
9. After Patterns has finished, open up Argus-Calibrate.
10. Click ‘Open’ and navigate to the Pickle file you just created.
11. Specify a filename and location, or again simply hit ‘Go’.
12. If you see messages about high RMSEs, stop the process and check the box that says ‘Invert grid coordinates’.
13. Wait for Argus-Calibrate to finish.
14. The CSV file that Argus-Calibrate writes contains the distortion coefficients and camera intrinsics (i.e. the distortion profile). You can plug these directly into the entry boxes of Argus-DWarp, or create a new CSV, similar to the ones included in the ‘calibrations’ folder within Argus, that DWarp will read upon every startup. Once you’ve plugged in the distortion coefficients, focal length, optical center, etc., use DWarp as usual to undistort your videos. Check the box labeled ‘Crop to undistorted region’ to make the video display only the portions which are undistorted. This is probably the most desirable setting for those using Argus simply to remove lens distortion.
I’ve installed Argus for Python 3 but get this error message:
"Argus
Traceback (most recent call last):
File "/home/user/anaconda2/bin/Argus", line 4, in <module>
__import__('pkg_resources').run_script('argus-gui==2.1.dev2', 'Argus')
File "/home/user/anaconda2/lib/python2.7/site-packages/setuptools-27.2.0-py2.7.egg/pkg_resources/__init__.py", line 744, in run_script
File "/home/user/anaconda2/lib/python2.7/site-packages/setuptools-27.2.0-py2.7.egg/pkg_resources/__init__.py", line 1506, in run_script
File "/home/user/anaconda2/lib/python2.7/site-packages/argus_gui-2.1.dev2-py2.7.egg/EGG-INFO/scripts/Argus", line 13, in <module>
File "/home/user/anaconda2/lib/python2.7/site-packages/setuptools-27.2.0-py2.7.egg/pkg_resources/__init__.py", line 1202, in resource_filename
File "/home/user/anaconda2/lib/python2.7/site-packages/setuptools-27.2.0-py2.7.egg/pkg_resources/__init__.py", line 435, in get_provider
File "/home/user/anaconda2/lib/python2.7/site-packages/argus_gui-2.1.dev2-py2.7.egg/argus_gui/__init__.py", line 17, in <module>
File "/home/user/anaconda2/lib/python2.7/site-packages/argus_gui-2.1.dev2-py2.7.egg/argus_gui/graphers.py", line 21, in <module>
File "/home/user/anaconda2/lib/python2.7/site-packages/argus_gui-2.1.dev2-py2.7.egg/argus_gui/tools.py", line 4, in <module>
ImportError: No module named cv2"
Nothing works at the moment
That second-to-last line referring to cv2 is probably the key: cv2 is the OpenCV library, and the "No module named …" error indicates that it is not installed properly. Can you give us a bit more info on how your install went, what platform (Mac/Win/Linux) you're using, and so on?
Hi! I was trying to sync three videos. I saw the waves and specified the time range. I pressed Go and the log stopped after this message:
Build info:
Windows-10-10.0.14393 AMD64
Found audio from file number 1 of 3
Found audio from file number 2 of 3
Found audio from file number 3 of 3
Reading waves and displaying…
It stayed frozen like this for over 10 minutes.
I used claps to synchronize when taking videos.
Would there be any suggestion to make the log work and synchronize?
Hi Kyu,
A few questions to help narrow down the cause of this problem. First, did you observe any CPU activity after Argus froze? Second, how large of a time range did you specify? Finally, can you also try this with only two videos and see if you have any better success?
Cheers,
Ty Hedrick
I get this error message after ‘Go’.
Exception in Tkinter callback
Traceback (most recent call last):
File "C:\Python27\lib\lib-tk\Tkinter.py", line 1542, in __call__
return self.func(*args)
File "C:\Python27\lib\site-packages\argus_gui-2.1.dev2-py2.7.egg\argus_gui\gui.py", line 874, in go
length = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
AttributeError: 'module' object has no attribute 'CAP_PROP_FRAME_COUNT'
The time frame that I chose was 0.08 to 0.10 minutes.
I tried with a longer time frame, from 0.06 to 0.12 minutes, but then (Not responding) appeared at the Argus top bar without any error message. I tried with two videos but it still gives the same error message. What would be the problem?
Thank you for helping me out with this error!
This error depends on the version of OpenCV you are using. Old versions of OpenCV wanted cv2.cv.CV_CAP_PROP_FRAME_COUNT as a holdover from really old (pre-cv2) versions. New versions of OpenCV do not support this and want cv2.CAP_PROP_FRAME_COUNT (and similarly in a few other places). Can you try updating OpenCV? It should make the error go away.
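For scripts that must run against both old and new OpenCV builds, one defensive pattern (our sketch, not Argus's actual code) is to look the constant up at runtime:

```python
import types

def cap_prop(cv2_module, name):
    """Return a VideoCapture property id in a version-tolerant way:
    modern OpenCV exposes cv2.CAP_PROP_*, while very old builds kept
    the same constants under cv2.cv.CV_CAP_PROP_*."""
    if hasattr(cv2_module, "CAP_PROP_" + name):
        return getattr(cv2_module, "CAP_PROP_" + name)
    return getattr(cv2_module.cv, "CV_CAP_PROP_" + name)

# Demonstrate with stand-in modules so the sketch runs without OpenCV.
new_style = types.SimpleNamespace(CAP_PROP_FRAME_COUNT=7)
old_style = types.SimpleNamespace(cv=types.SimpleNamespace(CV_CAP_PROP_FRAME_COUNT=7))
print(cap_prop(new_style, "FRAME_COUNT"), cap_prop(old_style, "FRAME_COUNT"))  # -> 7 7
```

With a real install you would call it as `cap.get(cap_prop(cv2, "FRAME_COUNT"))`; updating OpenCV, as suggested above, remains the simpler fix.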
Good evening,
I was trying the Patterns tool with the video you provide for the tutorial, but when I try to save it says "Cannot read chosen video". Besides that, I use the same parameters that appear in the tutorial, but when I click Go it shows me:
Build info:
Windows-10-10.0.15063 AMD64
Traceback (most recent call last):
pat = PatternFinder(rows, cols, spacing, ofile, ifile, start, stop)
self.start = int(np.floor(self.start*self.movie.get(cv2.CAP_PROP_FPS)))
AttributeError: ‘module’ object has no attribute ‘CAP_PROP_FPS’
Could you please explain to me what I am doing wrong, or whether I didn't fully install what was needed?
Thank you very much
I am experiencing this same exact problem. Any help would be appreciated!
The error indicates that for some reason the OpenCV module did not read the video file as expected. The first step is to check that OpenCV is properly installed; start a Python prompt and use the following commands to check your version:
>>> import cv2
>>> cv2.__version__
Argus basically expects version 3.x.x; what do you get?
Cheers,
Ty
Thanks, I got that fixed but now I’m having problems saving my points from clicker. It makes a shortcut to a file but cannot find it. Also the file is not in the location I specified or just does not exist. Any advice on that?
Good evening,
I am having problems using the clicker option. When I save the file and try to look for it in the file where I saved it it doesn’t appear.
Could you please explain to me what I am doing wrong?
Regards,
Paola Rossi
Dear Paola,
Argus won’t save the data until you use the “S” keyboard command; you should then get a message window stating that the data were saved. Argus does not (yet) ask to save upon program exit, so it is a bit too easy to close it and lose your data. Try again and see if this fixes your problem.
Hello Tyson,
Thank you very much for the answer. I tried to do as you told me, but the issue was that even though I pressed the ‘S’ key to save, it didn’t save and I couldn’t see the notification of the file being saved. In the end I re-installed it and it’s working now.
Regards,
Paola Rossi
Hello! First of all thank you very much for your software.
I have a question about the spacing between grid points in the calibration. When I measure the distance, should I measure only the white space in between the dots, or the distance between the centers of two adjacent dots?
Thank you in advance for your help!
You should use the distance between the centers of two adjacent dots.
Cheers,
Ty Hedrick
Thank you!
Hi
I just had a question. I ran the wand function and had a look at the SBA profile and the DLT outputs. Shouldn’t they have the same coefficients?
Also, I notice from both that even though camera 1 is chosen as the reference, it still has a translation and a pose value. Should it not be zero for translation and 1,0,0 for the reference camera?
I’m just trying to wrap my head around this!
Cheers
Amir
Hi Amir,
The SBA profile contains the camera intrinsics and extrinsics; the DLT coefficients convert these to an 11-parameter direct linear transformation form, which is easier for multi-camera triangulation but does not explicitly specify camera geometry and intrinsics. The base camera does get a small adjustment away from a null translation and rotation during the bundle adjustment optimization.
Cheers,
Ty
Hi Tyson
I am interested in the camera extrinsics so I was wondering if you could elaborate on the parameters in the SBA profile. It would be even better if you could indicate how to easily convert it to a rotation/translation matrix for each camera?
Thanks!
Mattias
Hi Mattias,
The extrinsics are at the end of the *-sba-profile.txt file, here’s the output from the Wand tutorial data, one row for each camera:
intrinsics…0.97431 0.07733 0.20393 0.05623 -0.63709 -0.06367 -0.11446
intrinsics…0.98324 0.07203 0.16701 0.01250 -0.40540 0.26981 -0.14359
intrinsics…0.99679 0.06391 0.03166 0.03641 -0.05673 -0.08110 -0.01142
The first 4 columns are the quaternion rotation and the last 3 are the translation, all with respect to the base camera (the last one in this case). This implies that in the data above the last line should actually be:
intrinsics…1 0 0 0 0 0 0
i.e. a unit quaternion followed by no translation. In fact, the bundle adjustment routines operate on the position and orientation of the last camera as well, and we don’t re-adjust for its slight change in position and orientation; we report it instead. If you want to make use of these elsewhere you may need to turn the quaternions into rotation matrices; there are many notes elsewhere on the web on how to do this. You should also be aware that the translations are always unscaled and unaligned to any alignment information passed in via Wand; these are the camera extrinsics just after bundle adjustment and before scaling and alignment.
Cheers,
Ty
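For readers who want to use these extrinsics elsewhere, here is a minimal quaternion-to-rotation-matrix sketch (assuming, as the unit row above suggests, that the four rotation columns are stored w-first):

```python
import numpy as np

def quat_to_rotmat(q):
    """Convert a quaternion in (w, x, y, z) order to a 3x3 rotation
    matrix; the input is normalized first so near-unit quaternions
    coming out of an optimizer are handled gracefully."""
    w, x, y, z = np.asarray(q, dtype=float) / np.linalg.norm(q)
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z), 2 * (x * z + w * y)],
        [2 * (x * y + w * z), 1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y), 2 * (y * z + w * x), 1 - 2 * (x * x + y * y)],
    ])

# The identity quaternion (1, 0, 0, 0) of the base camera gives the
# identity rotation matrix (no rotation).
print(quat_to_rotmat([1, 0, 0, 0]))
```

The last three columns of each profile row can then be used directly as the translation vector alongside this rotation matrix, keeping in mind the caveat above that they are unscaled and unaligned.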
Hello
Is there a method of calculating a dlt coefficients file given the extrinsics of each camera?
Thanks
Yeshara
Unfortunately there is not, though it might be possible to develop one. Argus works around this by establishing the 3D locations of the points using bundle adjustment, then using the 2D-to-3D correspondences to compute the DLT coefficients.
Cheers,
Ty
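The 2D-to-3D workaround described above comes down to a linear fit. Here is a sketch of fitting the 11 coefficients for one camera from matched points (synthetic round-trip data, not Argus's implementation):

```python
import numpy as np

def dlt_from_correspondences(xyz, uv):
    """Fit 11 DLT coefficients for one camera from n matched 3D
    points (n x 3) and their pixel coordinates (n x 2); at least
    6 well-spread, non-coplanar points are needed."""
    A, b = [], []
    for (X, Y, Z), (u, v) in zip(xyz, uv):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z])
        b.extend([u, v])
    L, *_ = np.linalg.lstsq(np.array(A, float), np.array(b, float), rcond=None)
    return L

# Round trip: project random points through known coefficients, refit.
true_L = np.array([0.2, 0, 0, 0, 0, 0.2, 0, 0, 0, 0, 0.2])
rng = np.random.default_rng(0)
pts = rng.uniform(-1.0, 1.0, size=(10, 3))
den = pts @ true_L[8:11] + 1.0
uv = np.column_stack([(pts @ true_L[0:3] + true_L[3]) / den,
                      (pts @ true_L[4:7] + true_L[7]) / den])
print(np.allclose(dlt_from_correspondences(pts, uv), true_L))  # -> True
```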
Thanks, Ty!
This is the error message I get when I try to run Argus on Linux; I have installed Python and OpenCV.
Traceback (most recent call last):
File "/usr/local/bin/Argus", line 4, in <module>
__import__('pkg_resources').run_script('argus-gui==2.1.dev3', 'Argus')
File "/usr/local/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 739, in run_script
self.require(requires)[0].run_script(script_name, ns)
File "/usr/local/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 1494, in run_script
exec(code, namespace, namespace)
File "/usr/local/lib/python2.7/dist-packages/argus_gui-2.1.dev3-py2.7.egg/EGG-INFO/scripts/Argus", line 13, in <module>
RESOURCE_PATH = os.path.abspath(pkg_resources.resource_filename('argus_gui.resources', ''))
File "/usr/local/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 1197, in resource_filename
return get_provider(package_or_requirement).get_resource_filename(
File "/usr/local/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 431, in get_provider
__import__(moduleOrReq)
File "/usr/local/lib/python2.7/dist-packages/argus_gui-2.1.dev3-py2.7.egg/argus_gui/__init__.py", line 21, in <module>
from .sbaDriver import *
File "/usr/local/lib/python2.7/dist-packages/argus_gui-2.1.dev3-py2.7.egg/argus_gui/sbaDriver.py", line 15, in <module>
import sba
File "build/bdist.linux-x86_64/egg/sba/__init__.py", line 5, in <module>
# package placeholder
File "build/bdist.linux-x86_64/egg/sba/projections.py", line 21, in <module>
File "/usr/lib/python2.7/ctypes/__init__.py", line 362, in __init__
self._handle = _dlopen(self._name, mode)
OSError: libsbaprojs.so: cannot open shared object file: No such file or directory
Hi Yash,
OK, the last line there ("libsbaprojs.so: cannot open shared object file") is the key: it indicates that you're using Linux and that your Python install is not finding the shared library that implements the sparse bundle adjustment routines. The easiest way around this is to copy the file to a place the operating system will definitely look for the library. Based on your debug output this is what you should do from a command prompt:
cd /usr/local/lib/python2.7/dist-packages/argus_gui-2.1.dev3-py2.7.egg/argus_gui/resources
sudo cp ./libsbaprojs.so /usr/local/lib
sudo ldconfig
Then try Argus again. Let me know by email if you encounter more troubles and we’ll debug and I’ll post the summary here.
Cheers,
Ty
Dear Ty,
Thank you so much. I managed to get Argus working on a Macintosh instead of using Linux, and I will try the workaround you mentioned in case I do have to use it on Linux as well.
Hello,
I am unable to complete the Clicker tutorial: every time I get to step 6 (go to frame 10 by hitting the ‘G’ key and typing 10 into the popup window), Argus “quits unexpectedly”. I’m running Argus 2.1 in Python 3 on a MacBook Pro running OS X 10.11.6. I’ve been able to complete all the other tutorials with no problems.
Hi Victoria,
As a followup to our email discussion about this: it appears that the error is popping up in Clicker only because Clicker makes use of the pyglet OpenGL library for drawing the movie windows, and recent builds of pyglet are not compatible with older MacOS versions. I tested a couple of options and found an install recipe using Anaconda that I think will fix your problem; it’s the 3rd entry in: http://argus.web.unc.edu/anaconda/
Cheers,
Ty
Hi!
I’m very glad to know your program and it is perfect!
But when I use the Wand function, it prints a warning like this:
Graphing and writing output files…
WARNING:py.warnings:C:\Anaconda3\lib\site-packages\argus_gui\graphers.py:322: FutureWarning: `rcond` parameter will change to the default of machine precision times `max(M, N)` where M and N are the input matrix dimensions.
To use the future default and silence this warning we advise to pass `rcond=None`, to keep using the old, explicitly pass `rcond=-1`.
L = np.linalg.lstsq(A,B)[0]
How can I correct this problem?
Thanks!
Dear Walter,
There’s no problem to fix just yet; the text is a warning only, communicating that the behavior of one of the numpy (numeric Python) routines that Argus uses will change in the future, and that we (the developers) might need to be aware of that. We’ll change Argus to acknowledge the change and suppress the warning output, but there’s no actual problem or change to the way Argus works.
Cheers,
Ty Hedrick
Hi
I’m a user of Argus_3d and I have a question about the ‘Patterns’ function.
I took a clip of a checkerboard and analyzed it, but it printed no values.
I can’t calibrate from the patterns or post-process.
How can I fix this problem?
If you need the clip, please let me know your email address.
Thanks!
Hi Walter,
The OpenCV API for chessboard recognition changed on us and Argus was out of step for a time; try uninstalling argus_gui (pip uninstall argus_gui) and reinstalling from the latest Bitbucket stable version: https://bitbucket.org/kilmoretrout/argus_gui/get/argus_gui_p3.zip
Alternatively, use a dot grid – that did not break with the API change.
Cheers,
Ty
Hi,
I am using 4 GoPro Hero5 cameras to measure rotation of two objects relative to each other in a relatively small volume (~1 m^3). How do I assess whether I have adequately calibrated the cameras (i.e. is there a recommended RMSE value)? A similar question for triangulating the cameras with Wand: is there a desired wand score? What does the wand score actually represent?
Thanks!
Hi Kate,
To get the wand score, Argus measures the wand length, i.e. the distance between the two wand end points, in each frame. The wand score is then the coefficient of variation of wand length multiplied by 100, i.e. stdev(wand_length)/mean(wand_length) * 100. In general, lower is better, and wand scores greater than 3 are problematic. If you’re using a wand to get the calibration for your 4-camera system then you can judge the calibration quality from the wand score.
As a further note, for GoPro Hero5 cameras you’ll probably get the best results by using the GoPro linear mode and making sure that all the focal lengths are fixed to the same value in Argus::Wand.
Cheers,
Ty Hedrick
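The wand-score definition above is easy to compute directly from two tracks of reconstructed wand ends; a minimal sketch:

```python
import numpy as np

def wand_score(end1, end2):
    """Wand score as defined above: 100 * coefficient of variation of
    the frame-by-frame wand length. end1 and end2 are (n, 3) arrays
    holding the reconstructed 3D positions of the two wand tips."""
    lengths = np.linalg.norm(np.asarray(end1) - np.asarray(end2), axis=1)
    return 100.0 * np.std(lengths) / np.mean(lengths)

# A perfectly rigid wand scores 0; reconstruction jitter raises it.
end1 = np.zeros((4, 3))
end2 = np.array([[1.00, 0, 0], [1.02, 0, 0], [0.98, 0, 0], [1.00, 0, 0]])
print(round(wand_score(end1, end2), 2))  # -> 1.41
```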
This software is working really well, but is there a reason cropping doesn’t seem to be available when using an omni model? The box is unavailable to check in the software, and it doesn’t work in command line either. Thanks!
Hi Wesley,
Cropping isn’t directly supported in the omnidirectional output because there’s no easy way to calculate how much “zoom” is necessary to get rid of the empty space at the margins. However, you can create the cropping effect with some experimentation. The Argus datafiles that hold the camera distortion coefficients are installed in this location on my Windows laptop “C:\Users\thedrick\Miniconda3\Lib\site-packages\argus_gui-2.1.dev3-py3.6.egg\argus_gui\resources\calibrations”. There’s one file for each camera model, e.g. “GoPro Hero4 Black.csv”. If you can find the appropriate file you can open and edit it in Microsoft Excel or any other spreadsheet program. The first numeric column (labelled “focal_length”) contains the zoom level for the omnidirectional undistortions. If you increase the number you zoom out and get more empty space, if you decrease it you’ll have less. A zoom value of between 1 and 2 is usually enough to remove the empty space entirely.
Cheers,
Ty Hedrick
Hi,
The following error occurs when I run Argus from my Mac (Catalina 10.15.3):
Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/pkg_resources/__init__.py", line 359, in get_provider
module = sys.modules[moduleOrReq]
KeyError: 'argus_gui.resources'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/3.8/bin/argus", line 13, in <module>
RESOURCE_PATH = os.path.abspath(pkg_resources.resource_filename('argus_gui.resources', ''))
File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/pkg_resources/__init__.py", line 1144, in resource_filename
return get_provider(package_or_requirement).get_resource_filename(
File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/pkg_resources/__init__.py", line 361, in get_provider
__import__(moduleOrReq)
File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/argus_gui/__init__.py", line 21, in <module>
from .sbaDriver import *
File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/argus_gui/sbaDriver.py", line 15, in <module>
import sba
File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/sba/__init__.py", line 5, in <module>
from sba.projections import *
File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/sba/projections.py", line 30, in <module>
_libsbaprojs = ctypes.CDLL("libsbaprojs.dylib")
File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/ctypes/__init__.py", line 373, in __init__
self._handle = _dlopen(self._name, mode)
OSError: dlopen(libsbaprojs.dylib, 6): no suitable image found. Did find:
file system relative paths not allowed in hardened programs
I referenced Yash’s post and copied “libsbaprojs.dylib” into /usr/local/lib, though this did not resolve the issue. Do you know how I can solve this issue? Thank you!
Hi, Kelly.
I have just added instructions to the installation instructions page specifically for Catalina. Start by uninstalling homebrew (unless you use it for anything else), then go through those instructions. Please let us know how it goes!
Brandon
Hello,
I am having trouble getting the auto-tracking to work in the Clicker GUI. After I mark the first frame and press ‘a’, nothing happens and I get this message printed to the console: ‘f0’ passed has more than 1 dimension. Can you help me understand what I am doing wrong? Thank you!
Hi Alana,
This turns out to be a bug that (I think) appeared when the output of one of the numpy functions we use changed slightly; it now requires some dimensional reduction before being fed into a function minimizer. I’ve made the necessary changes in the default and dev_fixalign branches on GitHub, but you’ll need to reinstall to get the fix. From the command line, after activating the appropriate environment, do:
pip uninstall argus_gui
pip install git+https://github.com/kilmoretrout/argus_gui.git@dev_fixalign
Cheers,
Ty Hedrick
Hi,
When trying to use DWarp we run into an issue. The process never finishes, and we get the following error output:
*** Terminating app due to uncaught exception ‘NSInvalidArgumentException’, reason: ‘-[QNSApplication _setup:]: unrecognized selector sent to instance 0x7feb5f7dc3e0’
terminating with uncaught exception of type NSException
abort() called
Any clue how to fix this?
I can’t access the tutorial data zip file or the individual videos. Is there an alternative link or repository for obtaining them?
Thanks in advance
This should be fixed; the tutorial data and tutorial movies should be available now.