I have 3 queries that pull data from 3 different tables (with joins), and their column names are pretty much the same (or I made them the same using the AS keyword). Once the 3 queries are complete, I want to combine their results so it looks like they come from one table. Please have a look at the code below.
First query
SELECT Client_Portfolio.*,
Client.Name,
Provider.Name,
"One" AS Income_Type,
One.`One_Gross_Fee` AS "Gross_Fee",
One.`One_V_Fee` AS "V_Fee",
One.`One_E_Fee` AS "E_Fee",
One.`One_I_Fee` AS "I_Fee",
One.`One_Tax_Provision` AS "Tax_Provision",
One.`One_Net_Income` AS "Net_Income",
"N/A" AS VAT,
One.`Updated_Date`
FROM Client_Portfolio
INNER JOIN Portfolio ON Portfolio.`idPortfolio` = Client_Portfolio.`idPortfolio`
INNER JOIN Client ON Client.idClient = Client_Portfolio.idClient
JOIN Provider ON Provider.idProvider = Portfolio.idProvider
INNER JOIN One ON One.idPortfolio = Portfolio.idPortfolio
Second query
SELECT Client_Portfolio.*,
Client.Name,
Provider.Name,
"Two" AS Income_Type,
Two.`Two_Gross_Fee` AS "Gross_Fee",
Two.`Two_V_Fee` AS "V_Fee",
Two.`Two_E_Fee` AS "E_Fee",
Two.`Two_I_Fee` AS "I_Fee",
Two.`Two_Tax_Provision` AS "Tax_Provision",
Two.`Two_Net_Income` AS "Net_Income",
Two.`Two_Vat` AS VAT,
Two.`Updated_Date`
FROM Client_Portfolio
INNER JOIN Portfolio ON Portfolio.`idPortfolio` = Client_Portfolio.`idPortfolio`
INNER JOIN Client ON Client.idClient = Client_Portfolio.idClient
JOIN Provider ON Provider.idProvider = Portfolio.idProvider
INNER JOIN Two ON Two.idPortfolio = Portfolio.idPortfolio
Third query
SELECT Client_Portfolio.*,
Client.Name,
Provider.Name,
"Three" AS Income_Type,
Three.`Three_Gross_Fee` AS "Gross_Fee",
"N\A" AS "V_Fee",
Three.`Three_E_Fee` AS "E_Fee",
"N\A" AS "I_Fee",
Three.`Three_Tax_Provision` AS "Tax_Provision",
Three.`Three_Net_Income` AS "Net_Income",
Three.`Three_Vat` AS VAT,
Three.`Updated_Date`
FROM Client_Portfolio
INNER JOIN Portfolio ON Portfolio.`idPortfolio` = Client_Portfolio.`idPortfolio`
INNER JOIN Client ON Client.idClient = Client_Portfolio.idClient
JOIN Provider ON Provider.idProvider = Portfolio.idProvider
INNER JOIN Three ON Three.idPortfolio = Portfolio.idPortfolio
Once these queries are done, I want to combine their results: the rows returned by the 2nd query should be appended after the rows returned by the 1st query, and the rows returned by the 3rd query appended after those of the 2nd query. Finally, I want to sort the combined result by Updated_Date.
How can I do this?
Combine the queries with UNION:
SELECT one_fields FROM Client_Portfolio ...
UNION
SELECT two_fields FROM Client_Portfolio ...
UNION
SELECT three_fields FROM Client_Portfolio ...
Sorting can be done by appending an ORDER BY clause after the last query, as follows:
SELECT one_fields FROM Client_Portfolio ...
UNION
SELECT two_fields FROM Client_Portfolio ...
UNION
SELECT three_fields FROM Client_Portfolio ...
ORDER BY field1, field2, field3...;
Note that field1, field2... can be field names or field numbers (column positions, starting from 1).
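Applied to the queries in the question, the combined statement might look like the sketch below (trimmed to a handful of the shared columns for brevity; the full column lists from the question work the same way, as long as every SELECT returns the same number of columns in the same order). UNION ALL is used here on the assumption that duplicate rows from the three queries should all be kept, since plain UNION removes duplicates:

SELECT Client.Name,
       "One" AS Income_Type,
       One.`One_Gross_Fee` AS Gross_Fee,
       One.`Updated_Date`
FROM Client_Portfolio
INNER JOIN Portfolio ON Portfolio.`idPortfolio` = Client_Portfolio.`idPortfolio`
INNER JOIN Client ON Client.idClient = Client_Portfolio.idClient
INNER JOIN One ON One.idPortfolio = Portfolio.idPortfolio
UNION ALL
SELECT Client.Name,
       "Two" AS Income_Type,
       Two.`Two_Gross_Fee` AS Gross_Fee,
       Two.`Updated_Date`
FROM Client_Portfolio
INNER JOIN Portfolio ON Portfolio.`idPortfolio` = Client_Portfolio.`idPortfolio`
INNER JOIN Client ON Client.idClient = Client_Portfolio.idClient
INNER JOIN Two ON Two.idPortfolio = Portfolio.idPortfolio
UNION ALL
SELECT Client.Name,
       "Three" AS Income_Type,
       Three.`Three_Gross_Fee` AS Gross_Fee,
       Three.`Updated_Date`
FROM Client_Portfolio
INNER JOIN Portfolio ON Portfolio.`idPortfolio` = Client_Portfolio.`idPortfolio`
INNER JOIN Client ON Client.idClient = Client_Portfolio.idClient
INNER JOIN Three ON Three.idPortfolio = Portfolio.idPortfolio
ORDER BY Updated_Date;

The final ORDER BY Updated_Date applies to the whole combined result, not just to the last SELECT.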