I've created a database, for example 'mydb'.
CREATE DATABASE mydb CHARACTER SET utf8 COLLATE utf8_bin;
CREATE USER 'myuser'@'%' IDENTIFIED BY PASSWORD '*HASH';
GRANT ALL ON mydb.* TO 'myuser'@'%';
GRANT ALL ON mydb TO 'myuser'@'%';
GRANT CREATE ON mydb TO 'myuser'@'%';
FLUSH PRIVILEGES;
Now I can log in to the database from anywhere, but I can't create tables.
How do I grant all privileges on that database and on its (future) tables? I can't create tables in the 'mydb' database. I always get:
CREATE TABLE t (c CHAR(20) CHARACTER SET utf8 COLLATE utf8_bin);
ERROR 1142 (42000): CREATE command denied to user 'myuser'@'...' for table 't'
GRANT ALL PRIVILEGES ON mydb.* TO 'myuser'@'%' WITH GRANT OPTION;
This is how I create my "Super User" privileges (although I would normally specify a host).
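After the GRANT runs, it is worth checking what the account actually ended up with. A quick verification against the same 'myuser'@'%' account from the question (the exact output wording varies by MySQL version):

SHOW GRANTS FOR 'myuser'@'%';
-- expect a line granting ALL PRIVILEGES ON `mydb`.* ... WITH GRANT OPTION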
While this answer can solve the problem of access, WITH GRANT OPTION creates a MySQL user that can edit the permissions of other users.
The GRANT OPTION privilege enables you to give to other users or remove from other users those privileges that you yourself possess.
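If WITH GRANT OPTION has already been handed out and you only want to take that part back while keeping the data privileges, a sketch against the same account (standard MySQL syntax, but verify on your version):

REVOKE GRANT OPTION ON mydb.* FROM 'myuser'@'%';
-- the remaining privileges on mydb.* stay in place; only the ability to re-grant them is removed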
For security reasons, you should not use this type of user account for any process that the public will have access to (e.g., a website). It is recommended that you create a user with only database privileges for that kind of use.
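As a rough sketch of that recommendation (the 'webapp' user name, host, and password below are made-up placeholders, not anything from the question), a web-facing account can be limited to plain data access on the one database:

CREATE USER 'webapp'@'localhost' IDENTIFIED BY 'choose-a-strong-password';
GRANT SELECT, INSERT, UPDATE, DELETE ON mydb.* TO 'webapp'@'localhost';
-- no GRANT OPTION, no CREATE/DROP/ALTER, and nothing outside mydb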