1 Answer
- Be sure that the column name lengths don't exceed 255 characters and don't contain special characters. For more information about column requirements, see Column.
- Check for malformed data. For example, if a column name doesn't conform to the regular expression pattern "[\u0020-\uD7FF\uE000-\uFFFD\uD800\uDC00-\uDBFF\uDFFF\t]", then the crawler doesn't work.
- Check for columns that have a length of 0. This can happen when the columns in the data don't match the data format of the table.
- If your data contains DECIMAL columns in the "(precision, scale)" format, then be sure that the scale value is less than or equal to the precision value.
- In the schema definition of your table, be sure that the Type of each column isn't longer than 131,072 bytes. For more information, see Column structure.
- If your crawler fails with either of the following errors, then be sure that the total schema definition of your table isn't larger than 1 MB:
  - "Unable to create table in Catalog"
  - "Payload size of request exceeded limit"
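The per-column limits above (name length, allowed characters, type-string size, DECIMAL precision/scale) can be checked locally before running the crawler. Below is a minimal sketch in Python; the column values, the `check_column` helper, and its error messages are made up for illustration, and the character class is the Python spelling of the pattern quoted above (the surrogate-pair range rewritten as `\U00010000-\U0010FFFF`):

```python
import re

# Python equivalent of the allowed-character pattern from the answer above:
# the surrogate-pair range \uD800\uDC00-\uDBFF\uDFFF becomes \U00010000-\U0010FFFF.
VALID_NAME = re.compile(r"^[\u0020-\uD7FF\uE000-\uFFFD\U00010000-\U0010FFFF\t]+$")

# Matches type strings such as "decimal(10,2)" to extract precision and scale.
DECIMAL_TYPE = re.compile(r"decimal\((\d+)\s*,\s*(\d+)\)", re.IGNORECASE)


def check_column(name: str, col_type: str) -> list[str]:
    """Return a list of problems found with one column definition (hypothetical helper)."""
    problems = []
    if not 1 <= len(name) <= 255:
        problems.append("name length must be between 1 and 255 characters")
    elif not VALID_NAME.match(name):
        problems.append("name contains characters outside the allowed pattern")
    if len(col_type.encode("utf-8")) > 131_072:
        problems.append("type string exceeds 131,072 bytes")
    m = DECIMAL_TYPE.match(col_type)
    if m and int(m.group(2)) > int(m.group(1)):
        problems.append("decimal scale is greater than precision")
    return problems


print(check_column("price", "decimal(10,2)"))  # → []
print(check_column("bad", "decimal(5,9)"))     # scale > precision is flagged
```

Running such a check over every column of your source data before crawling makes it easier to pinpoint which specific column trips the crawler.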
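For the 1 MB total-schema limit, you can get a rough local estimate by serializing the column definitions and measuring the byte count. This is only an approximation, not the exact size the service computes, and the column list below is hypothetical:

```python
import json

# Hypothetical column definitions in the shape the Glue Data Catalog uses
# ({"Name": ..., "Type": ...}); substitute your table's real columns.
columns = [{"Name": f"col_{i}", "Type": "string"} for i in range(1000)]

# Approximate the serialized schema size in bytes.
schema_bytes = len(json.dumps(columns).encode("utf-8"))
print(f"approximate schema size: {schema_bytes} bytes")
if schema_bytes > 1_000_000:
    print("schema likely exceeds the 1 MB Data Catalog limit")
```

If the estimate is close to or above 1 MB, consider trimming or consolidating columns before re-running the crawler.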
Answered 2 years ago