What is the size, in bytes, of an int(11) column in MySQL?
And what is the maximum value that can be stored in this column?
An INT is always 4 bytes, no matter what length is specified.
- TINYINT = 1 byte (8 bits)
- SMALLINT = 2 bytes (16 bits)
- MEDIUMINT = 3 bytes (24 bits)
- INT = 4 bytes (32 bits)
- BIGINT = 8 bytes (64 bits)

The length only specifies how many characters to pad when selecting data with the mysql command-line client. 12345 stored as int(3) will still show as 12345, and stored as int(10) it will still display as 12345, but you would have the option of padding out the first five digits. For example, if you added ZEROFILL, it would display as 0000012345.
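The display-width behavior described above can be sketched with a small demo table; the table and column names here are hypothetical, and note that ZEROFILL implies UNSIGNED in MySQL:

```sql
-- Hypothetical table demonstrating display width vs. storage size.
-- Both columns occupy exactly 4 bytes; only client-side padding differs.
CREATE TABLE width_demo (
    a INT(3),              -- display width 3, still a 4-byte INT
    b INT(10) ZEROFILL     -- display width 10, zero-padded on output
);

INSERT INTO width_demo (a, b) VALUES (12345, 12345);

SELECT a, b FROM width_demo;
-- a displays as 12345 (the width is ignored once the value exceeds it)
-- b displays as 0000012345 (zero-padded to 10 digits)
```

As a side note, the integer display width attribute is deprecated as of MySQL 8.0.17, so new code should not rely on it.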
... and the maximum value will be 2147483647 (signed) or 4294967295 (unsigned).
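These limits follow directly from the 4-byte (32-bit) storage size rather than from the declared width. A minimal sketch computing each MySQL integer type's range from its byte count (the helper name is my own, for illustration):

```python
# Signed/unsigned ranges of MySQL integer types, derived from byte sizes.
SIZES = {"TINYINT": 1, "SMALLINT": 2, "MEDIUMINT": 3, "INT": 4, "BIGINT": 8}

def int_range(nbytes, unsigned=False):
    """Return (min, max) representable in an n-byte integer column."""
    bits = 8 * nbytes
    if unsigned:
        return 0, 2**bits - 1
    return -(2 ** (bits - 1)), 2 ** (bits - 1) - 1

print(int_range(SIZES["INT"]))         # (-2147483648, 2147483647)
print(int_range(SIZES["INT"], True))   # (0, 4294967295)
```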