Getting Started
The NVIDIA AI-Assisted Annotation SDK follows a client-server approach to integrate into an application. Once a user has been granted early access, either the C++ or the Python client can be used to integrate the SDK into an existing medical imaging application; a minimal sketch of this pattern follows the list of supported platforms below.
It officially supports the following three platforms:
- Linux (Ubuntu 16+)
- macOS (High Sierra and above)
- Windows (Windows 10)
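Before walking through the installation steps, here is a minimal sketch of what that integration pattern can look like: the imaging application owns a single client object and forwards every AI-assisted request to a remote AIAA server. The AnnotationBackend class is a hypothetical name used only for illustration, and the sketch assumes the nvidia::aiaa::Client constructor accepts a std::string server URI; it relies only on the constructor, models(), and toJson() calls that appear in the full example later in this section.

// Illustrative sketch only; AnnotationBackend is not part of the SDK.
#include <nvidia/aiaa/client.h>
#include <iostream>
#include <string>

class AnnotationBackend {
public:
  // The client only needs the AIAA server URI; all inference runs server-side.
  explicit AnnotationBackend(const std::string& serverUri) : client_(serverUri) {}

  // Print the models currently published by the server (JSON description).
  void printAvailableModels() {
    std::cout << client_.models().toJson() << std::endl;
  }

private:
  nvidia::aiaa::Client client_;  // shared by all annotation requests
};

int main() {
  try {
    AnnotationBackend backend("http://my-aiaa-server.com:5000/v1");  // placeholder server address
    backend.printAvailableModels();
  } catch (nvidia::aiaa::exception& e) {
    std::cerr << "nvidia.aiaa.error." << e.id << "; description: " << e.name() << std::endl;
  }
  return 0;
}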
Installing prebuilt C++ packages
Download the binary package for the corresponding OS from Releases.
Linux
export version=1.0.2
wget https://github.com/NVIDIA/ai-assisted-annotation-client/releases/download/v${version}/NvidiaAIAAClient-${version}-Linux.sh
sudo sh NvidiaAIAAClient-${version}-Linux.sh --prefix=/usr/local --exclude_sub_dir --skip-license
export LD_LIBRARY_PATH=/usr/local/lib
c++ -std=c++11 -o example example.cpp -lNvidiaAIAAClient
./example
macOS
export version=1.0.2
wget https://github.com/NVIDIA/ai-assisted-annotation-client/releases/download/v${version}/NvidiaAIAAClient-${version}-Darwin.sh
sh NvidiaAIAAClient-${version}-Darwin.sh --prefix=/usr/local --exclude_sub_dir --skip-license
c++ -std=c++11 -o example example.cpp -lNvidiaAIAAClient
./example
Windows
Download NvidiaAIAAClient-<version>-win64.exe from Releases and install the package. By default it installs into C:\Program Files\NvidiaAIAAClient.
PATH=C:\Program Files\NvidiaAIAAClient\bin;%PATH%
cl /EHsc -I"C:\Program Files\NvidiaAIAAClient\include" example.cpp /link NvidiaAIAAClient.lib /LIBPATH:"C:\Program Files\NvidiaAIAAClient\lib"
example.exe
The following code snippet shows how to start using the AIAA Client APIs:
// example.cpp
#include <nvidia/aiaa/client.h>
#include <iostream>

int main() {
  try {
    // Create AIAA Client object
    nvidia::aiaa::Client client("http://my-aiaa-server.com:5000/v1");

    // List all models
    nvidia::aiaa::ModelList modelList = client.models();
    std::cout << "Models Supported by AIAA Server: " << modelList.toJson() << std::endl;

    // Get matching model for organ Spleen
    nvidia::aiaa::Model model = modelList.getMatchingModel("spleen");
    std::cout << "Selected AIAA Model for organ 'Spleen' is: " << model.toJson(2) << std::endl;

    // More API calls can follow here...
  } catch (nvidia::aiaa::exception& e) {
    std::cerr << "nvidia::aiaa::exception => nvidia.aiaa.error." << e.id
              << "; description: " << e.name() << std::endl;
  }
  return 0;
}
More details on the C++ Client APIs can be found in client.h.
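As a small step beyond example.cpp, the sketch below wraps the same calls into a reusable helper that returns the selected model's JSON description, or an empty string when the server cannot be reached or no model matches. The function name findModelForOrgan is purely illustrative and not part of the SDK, and the sketch assumes toJson() returns a std::string; otherwise it uses only the Client, ModelList, Model, and exception members shown above.

// Illustrative helper only; findModelForOrgan is not part of the SDK.
#include <nvidia/aiaa/client.h>
#include <iostream>
#include <string>

std::string findModelForOrgan(const std::string& serverUri, const std::string& organ) {
  try {
    nvidia::aiaa::Client client(serverUri);                          // connect to the AIAA server
    nvidia::aiaa::ModelList modelList = client.models();             // fetch the published models
    nvidia::aiaa::Model model = modelList.getMatchingModel(organ);   // pick one matching the label
    return model.toJson(2);                                          // pretty-printed JSON
  } catch (nvidia::aiaa::exception& e) {
    std::cerr << "nvidia.aiaa.error." << e.id << "; description: " << e.name() << std::endl;
    return "";                                                       // signal failure to the caller
  }
}

int main() {
  std::string json = findModelForOrgan("http://my-aiaa-server.com:5000/v1", "spleen");
  std::cout << (json.empty() ? "model lookup failed" : json) << std::endl;
  return 0;
}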
CMake Support
You can also use the NvidiaAIAAClient interface target in CMake. This target populates the appropriate usage requirements: NvidiaAIAAClient_INCLUDE_DIRS points to the include directories and NvidiaAIAAClient_LIBRARY provides the library to link against.
Find Package
To use this library from a CMake project, you can locate it directly with find_package() and use the namespaced imported target from the generated package configuration:
# CMakeLists.txt
find_package(NvidiaAIAAClient REQUIRED)
...
include_directories(${NvidiaAIAAClient_INCLUDE_DIRS})
...
target_link_libraries(foo ${NvidiaAIAAClient_LIBRARY})
The package configuration file, NvidiaAIAAClientConfig.cmake, can be used either from an install tree or directly out of the build tree.
For example, you can specify the -DNvidiaAIAAClient_DIR option while generating the CMake targets for project foo:
$ cmake -DNvidiaAIAAClient_DIR=/user/xyz/myinstall/lib/cmake/NvidiaAIAAClient
External Project
Alternatively, you can fetch and build the client by adding it as an external project in CMake:
# CMakeLists.txt
...
ExternalProject_Add(NvidiaAIAAClient
  GIT_REPOSITORY https://github.com/NVIDIA/ai-assisted-annotation-client.git
  GIT_TAG v1.0.2
)
...
target_link_libraries(foo ${NvidiaAIAAClient_LIBRARY})