This is the Linux app named MegEngine, whose latest release can be downloaded as MegEngine v1.13.2 source code.zip. It can be run online for free in OnWorks, a hosting provider for workstations.
Download and run this app named MegEngine online with OnWorks for free.
Follow these instructions in order to run this app:
- 1. Download this application to your PC.
- 2. Log in to our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.
- 3. Upload this application to that file manager.
- 4. Start the OnWorks Linux online, Windows online, or macOS online emulator from this website.
- 5. From the OnWorks Linux OS you have just started, go to our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.
- 6. Download the application, install it, and run it.
SCREENSHOTS
MegEngine
DESCRIPTION
MegEngine is a fast, scalable, and easy-to-use deep learning framework with three key features. You can represent quantization, dynamic shape, image pre-processing, and even derivation in a single model. After training, simply put everything into your model and run inference on any platform with ease. Speed and precision problems won't bother you anymore, because training and inference share the same core. During training, GPU memory usage can drop to one-third at the cost of only one additional line of code, which enables the DTR algorithm. Achieve the lowest memory usage when running inference on a model by leveraging the unique pushdown memory planner. NOTE: MegEngine currently supports Python installation on Linux-64bit/Windows-64bit/macOS(CPU-only)-10.14+/Android 7+(CPU-only) platforms with Python 3.5 to 3.8. On Windows 10 you can either install the Linux distribution through Windows Subsystem for Linux (WSL) or install the Windows distribution directly. Many other platforms are supported for inference.
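As a sketch of the Python installation path described above, the prebuilt wheels can be installed with pip; the wheel index URL below is the one published in MegEngine's own README, so verify it against the current documentation before relying on it:

```shell
# Install the prebuilt MegEngine wheel (Linux-64bit / Windows-64bit / macOS CPU-only).
# The -f wheel index URL comes from MegEngine's README; check the docs if it has moved.
python3 -m pip install --upgrade pip
python3 -m pip install megengine -f https://megengine.org.cn/whl/mge.html
```

On Windows 10 under WSL, the same command installs the Linux distribution inside the WSL environment.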
Features
- Fast, high-precision inference on x86/Arm/CUDA/ROCm
- Supports Linux/Windows/iOS/Android/TEE
- Unified core for both training and inference
- Lowest hardware requirements, aided by algorithms such as DTR
- Efficient inference on all platforms
- Save more memory and optimize speed by leveraging advanced usage
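The "one additional line" memory saving mentioned in the description refers to MegEngine's DTR (Dynamic Tensor Rematerialization) mode. A minimal sketch, assuming the `megengine.dtr` module API as shown in the project's documentation (requires the `megengine` package to be installed):

```python
import megengine as mge

# The one additional line before training: enable DTR so intermediate
# activations can be evicted from GPU memory and recomputed on demand,
# trading extra compute for a much smaller memory footprint.
mge.dtr.enable()

# ... then build the model and optimizer and train as usual;
# no other changes to the training loop are needed.
```

Whether DTR helps depends on the model: it benefits large models whose activation memory dominates, at the cost of some recomputation time.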
Programming Language
C++
Categories
This application can also be fetched from https://sourceforge.net/projects/megengine.mirror/. It is hosted on OnWorks so that it can be run online in the easiest way from one of our free operating systems.