1 Answer
- Be sure that the column name lengths don't exceed 255 characters and don't contain special characters. For more information about column requirements, see Column.
- Check for malformed data. For example, if a column name doesn't conform to the regular expression pattern "[\u0020-\uD7FF\uE000-\uFFFD\uD800\uDC00-\uDBFF\uDFFF\t]", then the crawler doesn't work.
- Check for columns that have a length of 0. This can happen when the columns in the data don't match the data format of the table.
- If your data contains DECIMAL columns in the "(precision, scale)" format, then be sure that the scale value is less than or equal to the precision value.
- In the schema definition of your table, be sure that the Type of each column isn't longer than 131,072 bytes. For more information, see Column structure.
- If your crawler fails with either of the following errors, then be sure that the total schema definition of your table is not larger than 1 MB:
  - "Unable to create table in Catalog"
  - "Payload size of request exceeded limit"
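The checks above can be run locally before you rely on the crawler. Here is a minimal Python sketch that applies them to a list of column definitions in the `{"Name": ..., "Type": ...}` shape that the Glue Data Catalog uses. The function name `validate_columns`, the limits as constants, and the sample columns are my own illustration, not an official AWS tool; the surrogate-pair range `\uD800\uDC00-\uDBFF\uDFFF` from the pattern above is written here as `\U00010000-\U0010FFFF`, which is the same supplementary-plane range in a form Python's `re` accepts.

```python
import json
import re

# Limits from the checklist above (illustrative constants, not an AWS API).
MAX_NAME_LEN = 255             # column name length limit, in characters
MAX_TYPE_LEN = 131_072         # column Type length limit, in bytes
MAX_SCHEMA_BYTES = 1_048_576   # total schema definition limit (1 MB)

# Characters allowed in a column name, per the regex pattern in the answer.
ALLOWED_NAME = re.compile(
    r"^[\u0020-\uD7FF\uE000-\uFFFD\U00010000-\U0010FFFF\t]+$"
)

# Matches types like "decimal(10,2)" to compare precision and scale.
DECIMAL_RE = re.compile(r"^decimal\((\d+),\s*(\d+)\)$", re.IGNORECASE)


def validate_columns(columns):
    """Return a list of problems found in [{"Name": ..., "Type": ...}]."""
    problems = []
    for col in columns:
        name, ctype = col.get("Name", ""), col.get("Type", "")
        if len(name) == 0:
            problems.append("column with empty name (length 0)")
            continue
        if len(name) > MAX_NAME_LEN:
            problems.append(f"{name!r}: name longer than {MAX_NAME_LEN} characters")
        if not ALLOWED_NAME.match(name):
            problems.append(f"{name!r}: name contains disallowed characters")
        if len(ctype.encode("utf-8")) > MAX_TYPE_LEN:
            problems.append(f"{name!r}: Type longer than {MAX_TYPE_LEN} bytes")
        m = DECIMAL_RE.match(ctype)
        if m and int(m.group(2)) > int(m.group(1)):
            problems.append(f"{name!r}: DECIMAL scale exceeds precision")
    # Approximate the total schema definition size by serializing the columns.
    schema_bytes = len(json.dumps(columns).encode("utf-8"))
    if schema_bytes > MAX_SCHEMA_BYTES:
        problems.append(f"schema definition is {schema_bytes} bytes (> 1 MB)")
    return problems


cols = [
    {"Name": "order_id", "Type": "bigint"},
    {"Name": "amount", "Type": "decimal(10,12)"},  # scale > precision
    {"Name": "", "Type": "string"},                # length-0 column name
]
print(validate_columns(cols))
```

The 1 MB check is only an approximation, since the exact request payload the crawler sends differs from a plain JSON dump of the columns, but it is close enough to flag schemas that are clearly over the limit.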
Answered 2 years ago