This is the Windows app Chinese-LLaMA-Alpaca 2, whose latest release can be downloaded as ZhongWenYangTuoDaMoXingErQiv3.1.zip. It can be run online through OnWorks, a free hosting provider for workstations.
Download and run this app, Chinese-LLaMA-Alpaca 2, online for free with OnWorks.
Follow these steps to run the app:
- 1. Download the application to your PC.
- 2. Go to our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username of your choice.
- 3. Upload the application to that file manager.
- 4. Start any OnWorks online OS emulator from this website, preferably the Windows online emulator.
- 5. From the OnWorks Windows OS you have just started, go to our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username of your choice.
- 6. Download the application and install it.
- 7. Install Wine from your Linux distribution's software repositories. Once it is installed, you can double-click the app to run it with Wine. You can also try PlayOnLinux, a fancy interface over Wine that helps you install popular Windows programs and games.
Wine is a way to run Windows software on Linux with no copy of Windows required. Wine is an open-source Windows compatibility layer that can run Windows programs directly on any Linux desktop. Essentially, Wine re-implements enough of Windows from scratch to run Windows applications without actually needing Windows.
SCREENSHOTS
Chinese-LLaMA-Alpaca 2
DESCRIPTION
This project builds on Llama-2, the commercially usable large model released by Meta, and is the second phase of the Chinese LLaMA & Alpaca large model project. It open-sources the Chinese LLaMA-2 base model and the Chinese Alpaca-2 instruction fine-tuned large model. These models expand and optimize the Chinese vocabulary of the original Llama-2 and use large-scale Chinese data for incremental pre-training, further improving Chinese semantic understanding and instruction following and yielding significant performance gains. The models support FlashAttention-2 training and a 4K context, which can be extended to 18K+ with the NTK method.
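The NTK method mentioned above extends the usable context by rescaling the frequency base of the rotary position embedding (RoPE) instead of retraining on longer sequences. As a hedged sketch (the exact scaling used in the project's code may differ), a common NTK-aware formula enlarges the base so that the fastest-rotating dimensions are nearly unchanged while the slowest ones are stretched by roughly the target scale factor:

```python
def ntk_scaled_rope_base(base: float, head_dim: int, scale: float) -> float:
    """NTK-aware scaling: enlarge the RoPE frequency base so the
    highest-frequency dimensions stay almost unchanged while the
    lowest-frequency ones are stretched by roughly `scale`."""
    return base * scale ** (head_dim / (head_dim - 2))

def rope_inverse_frequencies(base: float, head_dim: int) -> list[float]:
    # Standard RoPE inverse frequencies, one per pair of dimensions.
    return [base ** (-2 * i / head_dim) for i in range(head_dim // 2)]

# Stretching a 4K-trained context to ~18K is a scale factor of about 4.5.
orig = rope_inverse_frequencies(10000.0, 128)
scaled = rope_inverse_frequencies(ntk_scaled_rope_base(10000.0, 128, 4.5), 128)
# The fastest-rotating component is unchanged, while the slowest rotates
# about 4.5x more slowly, letting the model address positions beyond its
# original training length.
```

Because the change is only to the position-encoding frequencies, it can be applied at inference time with no extra training.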
Features
- Expanded the Chinese vocabulary for the Llama-2 model, and open-sourced the Chinese LLaMA-2 and Alpaca-2 large models
- Open-source pre-training and instruction fine-tuning scripts, so users can further train the models as needed
- Quickly quantize and deploy large models locally using a personal computer's CPU/GPU
- Currently open-sourced models: Chinese-LLaMA-2 (7B/13B) and Chinese-Alpaca-2 (7B/13B) (for larger models, see the first phase of the project)
- Optimized Chinese vocabulary
- Efficient attention based on FlashAttention-2
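The local-deployment feature above depends on quantization, which shrinks weights to low-bit integers so a model fits in ordinary CPU/GPU memory. The following is a toy, self-contained sketch of symmetric 4-bit "absmax" quantization, similar in spirit to (but not byte-identical with) the Q4 formats used by common local-inference toolchains; the function names are illustrative:

```python
def quantize_q4(block: list[float]) -> tuple[float, list[int]]:
    """Map a block of float weights to 4-bit integers in [-8, 7]
    plus one shared float scale (symmetric absmax quantization)."""
    absmax = max(abs(x) for x in block)
    scale = absmax / 7 if absmax > 0 else 1.0
    return scale, [max(-8, min(7, round(x / scale))) for x in block]

def dequantize_q4(scale: float, q: list[int]) -> list[float]:
    """Recover approximate float weights from the 4-bit codes."""
    return [scale * v for v in q]

weights = [0.12, -0.5, 0.33, 0.07, -0.9, 0.41, 0.0, 0.66]
scale, q = quantize_q4(weights)
restored = dequantize_q4(scale, q)
# The round trip is lossy, but each weight's error is at most scale / 2.
```

Real formats pack the 4-bit codes two per byte and quantize in fixed-size blocks, so memory use drops to roughly a quarter of 16-bit weights at a small accuracy cost.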
Programming Language
Python
Categories
This application can also be fetched from https://sourceforge.net/projects/chinese-llama-alpaca-2.mirror/. It is hosted on OnWorks so that it can be run online in the easiest way from one of our free operating systems.