Explain the code and help me understand and memorize it.

**Title:** A Text-Sequence Application Example: Machine Translation
**Logo/Text:** 广州软件学院 SOFTWARE ENGINEERING INSTITUTE OF GUANGZHOU
**Section Header:** □ Building the Encoder [Key Point]
**List Item:** ✓ Input layer, Embedding, LSTM, output layer
**Code Block:**
```python
from keras.layers import Input, LSTM, Dense, Embedding
from keras.models import Model, load_model

hidden_size_1 = 64   # embedding dimension
hidden_size_2 = 128  # LSTM hidden-state dimension

# max_length and eng_vocab_size are assumed to be defined earlier in the lesson
# (maximum source-sentence length and English vocabulary size).
encoder_input = Input(shape=(max_length,))
# Map token ids to dense vectors; mask_zero=True skips padded positions (id 0).
x = Embedding(eng_vocab_size, hidden_size_1, mask_zero=True, name='ENG')(encoder_input)
# return_state=True also returns the final hidden state h and cell state c.
encoder_output, encoder_h, encoder_c = LSTM(hidden_size_2, return_state=True, name='Encoder')(x)
# The (h, c) pair is what will initialize the decoder's LSTM state.
encoder_state = [encoder_h, encoder_c]
```
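To see concretely what the encoder hands over to the decoder, here is a minimal runnable sketch. The values of `max_length` and `eng_vocab_size` are placeholders assumed for illustration (the slide does not give them; in the course they would come from the preprocessed corpus), and wrapping the encoder into its own `Model` is only done here to inspect shapes, not part of the slide's code.

```python
# Minimal sketch with assumed placeholder values for max_length and eng_vocab_size.
import numpy as np
from keras.layers import Input, LSTM, Embedding
from keras.models import Model

max_length = 20        # assumed maximum English sentence length (placeholder)
eng_vocab_size = 5000  # assumed English vocabulary size (placeholder)
hidden_size_1, hidden_size_2 = 64, 128

encoder_input = Input(shape=(max_length,))
x = Embedding(eng_vocab_size, hidden_size_1, mask_zero=True, name='ENG')(encoder_input)
_, encoder_h, encoder_c = LSTM(hidden_size_2, return_state=True, name='Encoder')(x)

# Wrap the encoder into a standalone Model so we can inspect exactly what it
# produces: two tensors of shape (batch, hidden_size_2).
encoder_model = Model(encoder_input, [encoder_h, encoder_c])
dummy = np.random.randint(1, eng_vocab_size, size=(2, max_length))  # 2 fake padded sentences
h, c = encoder_model.predict(dummy)
print(h.shape, c.shape)  # (2, 128) (2, 128)
```

Running this shows that the encoder compresses each (padded) input sentence into a fixed 128-dimensional hidden state and cell state, which is exactly the `encoder_state` pair used to initialize the decoder.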
