
Making word2vec-style word vectors with the BERT model


1. Model download

The BERT source code is hosted on GitHub; download it from
https://github.com/google-research/bert
The pre-trained BERT models are listed on that page.
PS: the Multilingual entries are multilingual models, and the last one listed is the Chinese model (character level only). "Uncased" means all letters are converted to lowercase, while "Cased" preserves the original case.
Download the file that matches your needs and extract it.
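If you prefer to script this step, here is a minimal sketch that downloads and extracts the uncased English base model using only the Python standard library. The download URL is the one listed in the repository README (verify it against the page above), and the local file and directory names are just placeholders for illustration.

from urllib.request import urlretrieve
import os
import zipfile

# Uncased English base model link from the BERT README; swap in the archive you need.
MODEL_URL = "https://storage.googleapis.com/bert_models/2018_10_18/uncased_L-12_H-768_A-12.zip"
ARCHIVE = "uncased_L-12_H-768_A-12.zip"
TARGET_DIR = "bert_models"

os.makedirs(TARGET_DIR, exist_ok=True)

# Download the archive only if it is not already present.
if not os.path.exists(ARCHIVE):
    urlretrieve(MODEL_URL, ARCHIVE)

# Extract the checkpoint, vocab, and config files.
with zipfile.ZipFile(ARCHIVE) as zf:
    zf.extractall(TARGET_DIR)

print(os.listdir(os.path.join(TARGET_DIR, "uncased_L-12_H-768_A-12")))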

2. Environment configuration

The following environment has been tested and runs correctly:

  1. python 3.6
  2. tensorflow 1.14.0
  3. tensorflow-gpu 1.14.0

Environment configuration

Minimal TensorFlow setup: there is no need to download CUDA or cuDNN separately.

  1. Use Anaconda to create a new virtual environment with Python version 3.6.
  2. Click the arrow to the right of the environment and choose "Open Terminal".
  3. In the terminal, enter conda install -c aaronzs tensorflow-gpu==xxx (you can also leave out the version) and wait for the download to finish.
  4. When it is done, run conda list; if cudatoolkit and cudnn appear in the list, the installation succeeded (a quick Python check is sketched after this list).
  5. Open PyCharm and create a new project, choosing the conda virtual environment you just configured as its interpreter.
  6. After creating the project, use the terminal to run the following commands and install the bert-serving support packages:

pip install bert-serving-client
pip install bert-serving-server
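As a quick sanity check for step 4, the short sketch below prints the installed TensorFlow version and whether a GPU is visible to it. It assumes nothing beyond the tensorflow-gpu package installed above.

import tensorflow as tf

print("TensorFlow version:", tf.__version__)      # expect 1.14.0
print("GPU available:", tf.test.is_gpu_available())  # expect True if CUDA/cuDNN were installed by conda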

Be careful that the TensorFlow, CUDA, and Python versions correspond to each other.
At this point, all of the required environments have been configured.

PS: environment configuration references:
"TensorFlow-GPU minimalist setup: no need to install CUDA or cuDNN!"
"A first experience with BERT"

3. Simple applications of the BERT model

Starting the bert-serving service

Enter the following in the terminal:

bert-serving-start -model_dir E:\Python-Project\uncased_L-12_H-768_A-12\uncased_L-12_H-768_A-12\ -num_worker=1

Here, -model_dir is the path where the extracted BERT model is stored; it is best to use an absolute path to avoid mistakes.

The startup has succeeded only when the log finally shows messages such as "start the sink", "ready", and "listening".

Only once the bert-serving service is running can the BERT model be used.
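Besides the bert-serving-start command line, the bert-serving-server package also documents a way to launch the service from Python. The sketch below follows that documented pattern as I understand it; the model path is a placeholder, so replace it with your own extracted directory, and treat the argument names as assumptions to verify against the bert-as-service README.

from bert_serving.server import BertServer
from bert_serving.server.helper import get_args_parser

# Build the same arguments as the bert-serving-start command line.
# Replace the model path with your own extracted BERT directory.
args = get_args_parser().parse_args([
    '-model_dir', r'E:\Python-Project\uncased_L-12_H-768_A-12\uncased_L-12_H-768_A-12',
    '-num_worker', '1',
])

server = BertServer(args)
server.start()   # clients can now connect on the default ports
server.join()    # keep the process alive while serving requests (Ctrl+C to stop)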

A simple BERT demo

Obtain the word vector for 'dog', a 1×768-dimensional vector:

from bert_serving.client import BertClient
import numpy as np
import tensorflow as tf

# Silence TensorFlow's info/warning logs.
tf.compat.v1.logging.set_verbosity(tf.compat.v1.logging.ERROR)

# Connect to the running bert-serving server (localhost, default ports).
client = BertClient()

# Encode the single word 'dog'; the result is a (1, 768) array.
x = client.encode(['dog'])
print(np.shape(x))
print(x)

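Going one step beyond a single word, the sketch below (my own example, not from the original post) encodes a few sentences with the same BertClient and compares them by cosine similarity, which is the usual way to make use of these 768-dimensional vectors. The sentences are arbitrary illustrations.

from bert_serving.client import BertClient
import numpy as np

client = BertClient()

sentences = ['the dog barks at the mailman',
             'a puppy is barking loudly',
             'stock prices fell sharply today']

# encode() returns a (3, 768) matrix, one row per sentence.
vecs = client.encode(sentences)

def cosine(a, b):
    """Cosine similarity between two 1-D vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(vecs[0], vecs[1]))  # related sentences, expect a higher score
print(cosine(vecs[0], vecs[2]))  # unrelated sentences, expect a lower score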

Bug record

  1. The BERT service starts but gets stuck partway through and never finishes. TensorFlow or tensorflow-gpu was not installed correctly; in short, an environment problem.
  2. Starting the graph fails! This may be a tensorflow-gpu version problem. I initially used 1.15 and got this error; after downgrading to 1.14 the error no longer appeared.
  3. Python reports an inexplicable startup error! I suspected there were too many Python versions on the machine making things messy, so I created a new project, copied the code over, restarted, and everything ran normally.