This is the Windows app named CPT, whose latest release can be downloaded as Version2.0.zip. It can be run online through OnWorks, a free hosting provider for workstations.
Download and run this app named CPT online with OnWorks for free.
Follow these instructions in order to run this app:
- 1. Download this application to your PC.
- 2. Enter our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.
- 3. Upload this application to that file manager.
- 4. Start any OnWorks online OS emulator from this website, preferably the Windows online emulator.
- 5. From the OnWorks Windows OS you have just started, go to our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.
- 6. Download the application and install it.
- 7. Download Wine from your Linux distribution's software repositories. Once it is installed, you can double-click the app to run it with Wine. You can also try PlayOnLinux, a fancy interface over Wine that will help you install popular Windows programs and games.
Wine is a way to run Windows software on Linux, but with no Windows required. Wine is an open-source Windows compatibility layer that can run Windows programs directly on any Linux desktop. Essentially, Wine is trying to re-implement enough of Windows from scratch so that it can run all those Windows applications without actually needing Windows.
SCREENSHOTS
CPT
DESCRIPTION
CPT is a pre-trained unbalanced Transformer for both Chinese language understanding and generation. We replace the old BERT vocabulary with a larger one of size 51271 built from the training data, in which we 1) add 6800+ missing Chinese characters (most of them traditional Chinese characters); 2) remove redundant tokens (e.g., Chinese character tokens with the ## prefix); 3) add some English tokens to reduce OOV.
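As an illustration of the enlarged vocabulary, the sketch below loads a tokenizer with the Hugging Face transformers library and checks its size; the hub ID fnlp/cpt-base is an assumption made here for demonstration and is not stated on this page.

```python
# Minimal sketch: inspect the enlarged CPT vocabulary with the Hugging Face
# transformers library. The hub ID "fnlp/cpt-base" is an assumption for
# illustration; substitute whichever checkpoint you actually downloaded.
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("fnlp/cpt-base")

# The description above says the new vocabulary has 51271 entries.
print("vocab size:", tokenizer.vocab_size)

# Chinese character tokens with the "##" prefix were removed, so a Chinese
# sentence should tokenize into plain, un-prefixed character tokens.
print(tokenizer.tokenize("中文理解与生成"))
```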
Position Embeddings: we extend max_position_embeddings from 512 to 1024. We initialize the new models from the old checkpoints with vocabulary alignment: token embeddings found in the old checkpoints are copied, and the newly added parameters are randomly initialized. We further train the new CPT & Chinese BART for 50K steps with batch size 2048, max sequence length 1024, peak learning rate 2e-5, and warmup ratio 0.1. Aiming to unify both NLU and NLG tasks, we propose a novel Chinese Pre-trained Unbalanced Transformer (CPT).
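The paragraph above describes copying old embeddings and randomly initializing new parameters when extending the position range and the vocabulary. Below is a minimal PyTorch sketch of that kind of checkpoint surgery under assumed sizes (hidden size 768, old vocabulary 21128); the tensors and the alignment map are illustrative and do not use the real CPT checkpoint keys.

```python
import torch

hidden = 768                           # illustrative hidden size
old_max_pos, new_max_pos = 512, 1024   # extend position embeddings
old_vocab, new_vocab = 21128, 51271    # assumed old vocab size -> new CPT vocab

# Stand-ins for tensors loaded from the old checkpoint.
old_pos_emb = torch.randn(old_max_pos, hidden)
old_tok_emb = torch.randn(old_vocab, hidden)

# Positions: copy the first 512 rows, randomly initialize the new 512.
new_pos_emb = torch.empty(new_max_pos, hidden).normal_(std=0.02)
new_pos_emb[:old_max_pos] = old_pos_emb

# Vocabulary alignment: alignment[i] is the old id of new token i, or -1 for
# tokens that only exist in the new vocabulary (building the map from the two
# vocab files is omitted here).
alignment = torch.full((new_vocab,), -1, dtype=torch.long)
new_tok_emb = torch.empty(new_vocab, hidden).normal_(std=0.02)
kept = alignment >= 0
new_tok_emb[kept] = old_tok_emb[alignment[kept]]
```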
Features
- This repository contains code and checkpoints for CPT
- A Pre-Trained Unbalanced Transformer for Both Chinese Language Understanding and Generation
- Shared Encoder (S-Enc): a Transformer encoder with fully-connected self-attention, designed to capture a common semantic representation for both language understanding and generation
- Understanding Decoder (U-Dec): a shallow Transformer encoder with fully-connected self-attention, designed for NLU tasks; the input of U-Dec is the output of S-Enc
- Generation Decoder (G-Dec): utilizes the output of S-Enc with cross-attention for generation (see the sketch after this list)
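To make the S-Enc / U-Dec / G-Dec layout above concrete, here is a minimal PyTorch sketch of an unbalanced encoder-decoder wired the same way; the class name, layer counts, and hidden size are assumptions for illustration, not the actual CPT implementation.

```python
import torch
import torch.nn as nn

class UnbalancedTransformerSketch(nn.Module):
    """Illustrative sketch of the S-Enc / U-Dec / G-Dec wiring described above.
    Layer counts and sizes are assumptions, not CPT's real configuration."""

    def __init__(self, vocab_size=51271, hidden=512, heads=8,
                 n_shared=10, n_understand=2, n_generate=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        enc_layer = nn.TransformerEncoderLayer(hidden, heads, batch_first=True)
        dec_layer = nn.TransformerDecoderLayer(hidden, heads, batch_first=True)
        # S-Enc: deep shared encoder with fully-connected self-attention.
        self.s_enc = nn.TransformerEncoder(enc_layer, n_shared)
        # U-Dec: shallow encoder stack for NLU; its input is the S-Enc output.
        self.u_dec = nn.TransformerEncoder(enc_layer, n_understand)
        # G-Dec: auto-regressive decoder that cross-attends to the S-Enc output.
        self.g_dec = nn.TransformerDecoder(dec_layer, n_generate)
        self.lm_head = nn.Linear(hidden, vocab_size)

    def forward(self, src_ids, tgt_ids):
        memory = self.s_enc(self.embed(src_ids))      # shared representation
        nlu_states = self.u_dec(memory)               # understanding branch
        tgt_len = tgt_ids.size(1)                     # causal mask for generation
        causal = torch.triu(torch.full((tgt_len, tgt_len), float("-inf")), diagonal=1)
        nlg_states = self.g_dec(self.embed(tgt_ids), memory, tgt_mask=causal)
        return nlu_states, self.lm_head(nlg_states)

# Tiny smoke test with random token ids.
model = UnbalancedTransformerSketch()
src = torch.randint(0, 51271, (2, 16))
tgt = torch.randint(0, 51271, (2, 8))
nlu, logits = model(src, tgt)
print(nlu.shape, logits.shape)   # torch.Size([2, 16, 512]) torch.Size([2, 8, 51271])
```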
Programming Language
Python
Categories
This is an application that can also be fetched from https://sourceforge.net/projects/cpt.mirror/. It has been hosted on OnWorks so that it can be run online in the easiest way from one of our free operating systems.