taosdump database migration error

I ran this command and it took our production server straight down:

taosdump -h 10.0.0.32 -P 6030 -u root -p -i /tmp/to/taos/backup

Production version: 3.0.0.1

It crashed when the backup reached 43%.

Error output:

connection 0x55c0bf43f760 is dumping out schema:41% of hmp_user_temp_data
connection 0x55c0bf43f760 is dumping out schema:42% of hmp_user_temp_data
connection 0x55c0bf43f760 is dumping out schema:43% of hmp_user_temp_data
ERROR: getTableDesColNative() LN2430, failed to run command <SELECT userid FROM hmp.hmp_user_temp_data_5259979>, taos: 0x55c0bf43f760, code: 0x8000000b, reason: Unable to establish connection
connection 0x55c0bf43f760 is dumping out schema:44% of hmp_user_temp_data
ERROR: getTableDesNative() LN2494, failed to run command <DESCRIBE hmp.hmp_user_temp_data_5272731>, taos: 0x55c0bf43f760, code: 0x8000000b, reason: Unable to establish connection
ERROR: createMTableAvroHeadImp() LN8236, Unable to write record to file. Message: Value too large for file block size
ERROR: getTableDesNative() LN2494, failed to run command <DESCRIBE hmp.hmp_user_temp_data_5269456>, taos: 0x55c0bf43f760, code: 0x8000000b, reason: Unable to establish connection
ERROR: createMTableAvroHeadImp() LN8236, Unable to write record to file. Message: Value too large for file block size
ERROR: getTableDesNative() LN2494, failed to run command <DESCRIBE hmp.hmp_user_temp_data_5258768>, taos: 0x55c0bf43f760, code: 0x8000000b, reason: Unable to establish connection
ERROR: createMTableAvroHeadImp() LN8236, Unable to write record to file. Message: Value too large for file block size
connection 0x55c0bf43f760 is dumping out schema:45% of hmp_user_temp_data
ERROR: getTableDesNative() LN2494, failed to run command <DESCRIBE hmp.hmp_user_temp_data_5271536>, taos: 0x55c0bf43f760, code: 0x8000000b, reason: Unable to establish connection
ERROR: createMTableAvroHeadImp() LN8236, Unable to write record to file. Message: Value too large for file block size
ERROR: getTableDesNative() LN2494, failed to run command <DESCRIBE hmp.hmp_user_temp_data_5274939>, taos: 0x55c0bf43f760, code: 0x8000000b, reason: Unable to establish connection
ERROR: createMTableAvroHeadImp() LN8236, Unable to write record to file. Message: Value too large for file block size
ERROR: getTableDesNative() LN2494, failed to run command <DESCRIBE hmp.hmp_user_temp_data_5271537>, taos: 0x55c0bf43f760, code: 0x8000000b, reason: Unable to establish connection
ERROR: createMTableAvroHeadImp() LN8236, Unable to write record to file. Message: Value too large for file block size
connection 0x55c0bf43f760 is dumping out schema:46% of hmp_user_temp_data
ERROR: getTableDesNative() LN2494, failed to run command <DESCRIBE hmp.hmp_user_temp_data_5258772>, taos: 0x55c0bf43f760, code: 0x8000000b, reason: Unable to establish connection
ERROR: createMTableAvroHeadImp() LN8236, Unable to write record to file. Message: Value too large for file block size
ERROR: getTableDesNative() LN2494, failed to run command <DESCRIBE hmp.hmp_user_temp_data_5270996>, taos: 0x55c0bf43f760, code: 0x8000000b, reason: Unable to establish connection
ERROR: createMTableAvroHeadImp() LN8236, Unable to write record to file. Message: Value too large for file block size
ERROR: getTableDesNative() LN2494, failed to run command <DESCRIBE hmp.hmp_user_temp_data_4061787>, taos: 0x55c0bf43f760, code: 0x8000000b, reason: Unable to establish connection
ERROR: createMTableAvroHeadImp() LN8236, Unable to write record to file. Message: Value too large for file block size
connection 0x55c0bf43f760 is dumping out schema:47% of hmp_user_temp_data

Could someone take a look at why this is happening?

Version 3.0.0.1 is quite old. Please upgrade to the latest version, 3.4.0.0, and then try the restore again.


Our 3.0.0.1 deployment is several years old, and we now want to get the data backed up. But running taosdump crashed our server. So the question is: is there any way to back the data out of the 3.0.0.1 server? For now we have manually restarted the 3.0.0.1 instance.

Stop the taosd service, back up the dataDir directory, then perform the upgrade, and after that run taosdump and the other operations.
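The cold-backup procedure above can be sketched as a dry run. It assumes systemd manages taosd and that dataDir is the default /var/lib/taos; both are assumptions, so check your own setup first:

```shell
# Dry-run sketch of the cold backup before upgrading: it only prints the
# steps instead of executing them. DATA_DIR and the archive path are
# assumptions, not values confirmed by this thread.
DATA_DIR="${DATA_DIR:-/var/lib/taos}"
ARCHIVE="/tmp/taos-datadir-$(date +%Y%m%d).tar.gz"

cat <<EOF
systemctl stop taosd          # stop writes so the on-disk files are consistent
tar -czf $ARCHIVE $DATA_DIR   # archive the whole data directory
# ... upgrade the TDengine packages here ...
systemctl start taosd         # restart, then run taosdump again
EOF
```

Remove the heredoc wrapper to execute the steps for real once the paths are confirmed.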


-i /tmp/to/taos/backup: that option is for importing; did you write the wrong one?

-o is the option that exports a backup.


Yes, I wrote it wrong. What I actually ran was:

taosdump -h 10.0.0.13 -P 6030 -u root -p -D hmp -o /tmp/to/taos/backup
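For reference, the two directions can be sketched like this. The hosts and paths are just the examples already used in this thread, and the echo keeps it a dry run:

```shell
# -o exports (backup), -i imports (restore). Echoed as a dry run so
# nothing touches a live server; drop the echo to actually execute.
BACKUP_DIR="/tmp/to/taos/backup"
mkdir -p "$BACKUP_DIR"

# Export the hmp database from the source server:
echo taosdump -h 10.0.0.13 -P 6030 -u root -p -D hmp -o "$BACKUP_DIR"

# Later, restore the dump files into the target server:
echo taosdump -h 10.0.0.32 -P 6030 -u root -p -i "$BACKUP_DIR"
```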

Xiaokang, hi. Which directory is the dataDir you mentioned on version 3.0.0.1? Is it the one below, i.e. should I back up everything under /var/lib/taos/?

/var/lib/taos/ # data storage directory (set by dataDir in taos.cfg)
├── db/ # database data
│ ├── vnode/ # virtual node data
│ │ ├── vnode1/
│ │ │ ├── tsdb/ # time-series data storage
│ │ │ ├── meta/ # metadata
│ │ │ ├── sma/ # stream computation data
│ │ │ └── wal/ # write-ahead log
│ │ └── vnode2/
│ │ └── …
│ ├── mnode/ # management node data
│ │ ├── meta/
│ │ ├── wal/
│ │ └── …
│ └── qnode/ # query node data
├── dnode/ # data node information
│ └── dnodeEps.json # endpoint information
├── tq/ # topic queue data
├── stream/ # stream computing data
├── sma/ # SMA (small materialized aggregate) data
├── tmp/ # temporary files
└── tdbctl/ # cluster control data

Yes, just back up the /var/lib/taos directory.
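Before copying, it may be worth confirming where dataDir actually points, since taos.cfg can override the default. The config path below is the usual default location, not something confirmed in this thread:

```shell
# Print the effective dataDir: whatever taos.cfg sets, else the default.
CFG="${CFG:-/etc/taos/taos.cfg}"
DEFAULT_DIR="/var/lib/taos"
if [ -f "$CFG" ] && grep -Eq '^[[:space:]]*dataDir' "$CFG"; then
  # taos.cfg overrides the default; show the configured value
  grep -E '^[[:space:]]*dataDir' "$CFG"
else
  echo "dataDir not set in $CFG; default is $DEFAULT_DIR"
fi
```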

connection 0x55c0bf43f760 is dumping out schema:99% of hmp_user_temp_data
ERROR: getTableDesNative() LN2494, failed to run command <DESCRIBE hmp.`hmp_user_temp_data_5271411`>, taos: 0x55c0bf43f760, code: 0x8000000b, reason: Unable to establish connection
ERROR: createMTableAvroHeadImp() LN8236, Unable to write record to file. Message: Value too large for file block size
ERROR: getTableDesNative() LN2494, failed to run command <DESCRIBE hmp.`hmp_user_temp_data_10830`>, taos: 0x55c0bf43f760, code: 0x8000000b, reason: Unable to establish connection
ERROR: createMTableAvroHeadImp() LN8236, Unable to write record to file. Message: Value too large for file block size
ERROR: getTableDesNative() LN2494, failed to run command <DESCRIBE hmp.`hmp_user_temp_data_5269458`>, taos: 0x55c0bf43f760, code: 0x8000000b, reason: Unable to establish connection
ERROR: createMTableAvroHeadImp() LN8236, Unable to write record to file. Message: Value too large for file block size

Error cause: Avro block-size limit. The table schema description is too large and exceeds the size limit of a single Avro block.

Proposed fix:

# Increase buffer sizes and timeouts
export TAOS_MAX_BINARY_DISPLAY_WIDTH=65535
export TAOS_SHELL_CONNECT_TIMEOUT=300
export TAOS_SHELL_QUERY_TIMEOUT=600

Then run taosdump with more explicit parameters:

taosdump -u root -p your_password \
-o /backup/dump_output \
-D hmp \
--avro-block-size 104857600 \
--avro-sync-interval 100 \
--max-binary-len 65535 \
--threads 1   # single thread, more stable

Would this approach work?

It should have nothing to do with the thread count. The key issue is that your version is too old; that taosdump release has bugs.