This is the Windows app named Bootstrap Your Own Latent (BYOL), whose latest release can be downloaded as 0.7.0.zip. It can be run online for free on the OnWorks hosting provider for workstations.
Download and run this app named Bootstrap Your Own Latent (BYOL) online for free with OnWorks.
Follow these instructions to run this app:
- 1. Download this application to your PC.
- 2. Open our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.
- 3. Upload this application in that file manager.
- 4. Start any OnWorks online OS emulator from this website, preferably the Windows online emulator.
- 5. From the OnWorks Windows OS you have just started, go to our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.
- 6. Download the application and install it.
- 7. Install Wine from your Linux distribution's software repositories. Once it is installed, you can double-click the app to run it with Wine. You can also try PlayOnLinux, a fancy interface over Wine that helps you install popular Windows programs and games.
Wine is a way to run Windows software on Linux, with no copy of Windows required. Wine is an open-source Windows compatibility layer that can run Windows programs directly on any Linux desktop. Essentially, Wine re-implements enough of Windows from scratch to run Windows applications without actually needing Windows.
SCREENSHOTS
Bootstrap Your Own Latent (BYOL)
DESCRIPTION
A practical implementation of an astoundingly simple method for self-supervised learning that achieves a new state of the art (surpassing SimCLR) without contrastive learning and without having to designate negative pairs. This repository offers a module with which one can easily wrap any image-based neural network (residual network, discriminator, policy network) to immediately start benefiting from unlabelled image data. There was evidence that batch normalization is key to making this technique work well, but a newer paper has successfully replaced batch norm with group norm plus weight standardization, refuting the claim that batch statistics are needed for BYOL to work. Simply plug in your neural network, specifying (1) the image dimensions and (2) the name (or index) of the hidden layer whose output is used as the latent representation for self-supervised training.
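The objective behind this description is simple to state: the online network predicts the target network's projection of another augmented view of the same image, and the loss is a normalized mean squared error between the two, with no negative pairs involved. A minimal pure-Python sketch of that loss (the function names `l2_normalize` and `byol_loss` are illustrative, not the repository's actual API):

```python
import math

def l2_normalize(v):
    """Scale a vector to unit length."""
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

def byol_loss(online_prediction, target_projection):
    """BYOL's regression objective: MSE between L2-normalized vectors,
    which equals 2 - 2 * cosine_similarity. No negative pairs needed."""
    p = l2_normalize(online_prediction)
    z = l2_normalize(target_projection)
    return 2.0 - 2.0 * sum(a * b for a, b in zip(p, z))
```

Identical directions give a loss of 0, orthogonal ones give 2, and opposite ones give 4; in practice the loss is also computed with the two augmented views swapped and the results summed.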
Features
- Practical implementation of an astoundingly simple method
- Group norm + weight standardization
- Simply plug in your neural network
- BYOL does not even need the target encoder to be an exponential moving average of the online encoder
- Fetch the embeddings or the projections
- Without contrastive learning
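For context on the exponential-moving-average point in the list above: in the original BYOL setup the target encoder's weights trail the online encoder's via an EMA update after each step. A sketch of that update in plain Python (the function name `ema_update` is illustrative; the module performs this internally):

```python
def ema_update(target_params, online_params, tau=0.99):
    """Move each target parameter toward its online counterpart:
    target <- tau * target + (1 - tau) * online."""
    return [tau * t + (1 - tau) * o
            for t, o in zip(target_params, online_params)]
```

With `tau=0.99` the target changes slowly, which stabilizes the regression target; the feature noted above is that BYOL can reportedly work even without this EMA.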
Programming Language
Python
Categories
This application can also be fetched from https://sourceforge.net/projects/bootstrap-latent-byol.mirror/. It has been hosted on OnWorks so that it can be run online in the easiest way from one of our free operating systems.