MySQL: join only the most recent row?

Date: 2010-09-01 14:19:30

Tags: mysql sql join

I have a table customer that stores customer_id, email and reference. There is an additional table customer_data that stores a historical record of the changes made to each customer, i.e. whenever a change is made, a new row is inserted.

In order to display the customer information in a table, the two tables need to be joined, but only the most recent row from customer_data should be joined to the customer table.

Because the query is paginated, it also has a limit and an offset.

How can I do this with MySQL? I imagine I want to put a DISTINCT in there somewhere...

The query, minus the limit, looks like this:

SELECT *, CONCAT(title,' ',forename,' ',surname) AS name
FROM customer c
INNER JOIN customer_data d on c.customer_id=d.customer_id
WHERE name LIKE '%Smith%' LIMIT 10, 20

Also, am I right in thinking that I can use CONCAT in this way?

(I appreciate that INNER JOIN may be the wrong type of JOIN to use. I don't actually know the difference between the different JOINs; I'm going to look into that now!)

9 answers:

Answer 0 (score: 112):

You may want to try the following:

SELECT    CONCAT(title, ' ', forename, ' ', surname) AS name
FROM      customer c
JOIN      (
              SELECT    MAX(id) max_id, customer_id 
              FROM      customer_data 
              GROUP BY  customer_id
          ) c_max ON (c_max.customer_id = c.customer_id)
JOIN      customer_data cd ON (cd.id = c_max.max_id)
WHERE     CONCAT(title, ' ', forename, ' ', surname) LIKE '%Smith%' 
LIMIT     10, 20;

Note that JOIN is simply a synonym for INNER JOIN.

Test case:

CREATE TABLE customer (customer_id int);
CREATE TABLE customer_data (
   id int, 
   customer_id int, 
   title varchar(10),
   forename varchar(10),
   surname varchar(10)
);

INSERT INTO customer VALUES (1);
INSERT INTO customer VALUES (2);
INSERT INTO customer VALUES (3);

INSERT INTO customer_data VALUES (1, 1, 'Mr', 'Bobby', 'Smith');
INSERT INTO customer_data VALUES (2, 1, 'Mr', 'Bob', 'Smith');
INSERT INTO customer_data VALUES (3, 2, 'Mr', 'Jane', 'Green');
INSERT INTO customer_data VALUES (4, 2, 'Miss', 'Jane', 'Green');
INSERT INTO customer_data VALUES (5, 3, 'Dr', 'Jack', 'Black');

Result (query without the WHERE and LIMIT clauses):

SELECT    CONCAT(title, ' ', forename, ' ', surname) AS name
FROM      customer c
JOIN      (
              SELECT    MAX(id) max_id, customer_id 
              FROM      customer_data 
              GROUP BY  customer_id
          ) c_max ON (c_max.customer_id = c.customer_id)
JOIN      customer_data cd ON (cd.id = c_max.max_id);

+-----------------+
| name            |
+-----------------+
| Mr Bob Smith    |
| Miss Jane Green |
| Dr Jack Black   |
+-----------------+
3 rows in set (0.00 sec)

Answer 1 (score: 69):

If you are working with heavy queries, it is better to move the request for the latest row into the WHERE clause. It is faster and looks cleaner.

SELECT c.*
FROM client AS c
LEFT JOIN client_calling_history AS cch ON cch.client_id = c.client_id
WHERE
   cch.cchid = (
      SELECT MAX(cchid)
      FROM client_calling_history
      WHERE client_id = c.client_id AND cal_event_id = c.cal_event_id
   )
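
Applied to the customer / customer_data tables from the question, the same pattern might look roughly like this (a sketch only; it assumes customer_data has an auto-increment id column, as in the accepted answer):

SELECT c.*, CONCAT(d.title, ' ', d.forename, ' ', d.surname) AS name
FROM customer AS c
LEFT JOIN customer_data AS d ON d.customer_id = c.customer_id
WHERE
   d.id = (
      SELECT MAX(id)
      FROM customer_data
      WHERE customer_id = c.customer_id
   )
LIMIT 10, 20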

Answer 2 (score: 10):

Assuming the auto-increment column in customer_data is named Id, you can do this:

SELECT *, CONCAT(title,' ',forename,' ',surname) AS name
FROM customer c
    INNER JOIN customer_data d 
        ON c.customer_id=d.customer_id
WHERE name LIKE '%Smith%'
    AND d.ID = (
                Select Max(D2.Id)
                From customer_data As D2
                Where D2.customer_id = D.customer_id
                )
LIMIT 10, 20

Answer 3 (score: 9):

For anyone who has to work with an older version of MySQL (roughly pre-5.0), you can't use subqueries for this type of query. Here is the solution I was able to use, and it seemed to work great.

SELECT MAX(d.id), d2.*, CONCAT(title,' ',forename,' ',surname) AS name
FROM customer AS c 
LEFT JOIN customer_data as d ON c.customer_id=d.customer_id 
LEFT JOIN customer_data as d2 ON d.id=d2.id
WHERE CONCAT(title, ' ', forename, ' ', surname) LIKE '%Smith%'
GROUP BY c.customer_id LIMIT 10, 20;

Basically, this finds the maximum id in the data table, joins it to the customer, and then joins the data table again on that maximum id. The reason for this is that selecting the max of a group does not guarantee that the rest of the row's data belongs to that id, unless you join it back to itself.

I haven't tested it on newer versions of MySQL, but it works on 4.0.30.

Answer 4 (score: 3):

I know this question is old, but it has received a lot of attention over the years, and I think it is missing a concept that could help someone in a similar case. I'm adding it here for completeness.

If you cannot modify the original database schema, then there are plenty of good answers here that solve the problem just fine.

If you can modify your schema, however, I would suggest adding a field to the customer table that holds the id of the latest customer_data record for that customer:

CREATE TABLE customer (
   id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
   current_data_id INT UNSIGNED NULL DEFAULT NULL
);
CREATE TABLE customer_data (
   id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
   customer_id INT UNSIGNED NOT NULL,
   title VARCHAR(10) NOT NULL,
   forename VARCHAR(10) NOT NULL,
   surname VARCHAR(10) NOT NULL
);

Querying customers

The query is then as easy and fast as it can be:

SELECT c.*, d.title, d.forename, d.surname
FROM customer c
INNER JOIN customer_data d on d.id = c.current_data_id
WHERE ...;

The drawback is the extra complexity when creating or updating a customer.

Updating a customer

Whenever you want to update a customer, you insert a new record into the customer_data table and update the customer record:

INSERT INTO customer_data (customer_id, title, forename, surname) VALUES(2, 'Mr', 'John', 'Smith');
UPDATE customer SET current_data_id = LAST_INSERT_ID() WHERE id = 2;

Creating a customer

Creating a customer is just a matter of inserting the customer entry first and then running the same statements:

INSERT INTO customer () VALUES ();

SET @customer_id = LAST_INSERT_ID();
INSERT INTO customer_data (customer_id, title, forename, surname) VALUES(@customer_id, 'Mr', 'John', 'Smith');
UPDATE customer SET current_data_id = LAST_INSERT_ID() WHERE id = @customer_id;

Wrapping up

The extra complexity of creating and updating customers may seem daunting, but it can easily be automated with triggers (a sketch of such a trigger follows the model below).

Finally, if you are using an ORM, this is really easy to manage: the ORM can insert the values, update the id and join the two tables for you automatically.

Here is how a mutable Customer model could look (an immutable variant would expose only the getters):

class Customer
{
    private int id;
    private CustomerData currentData;

    public Customer(String title, String forename, String surname)
    {
        this.update(title, forename, surname);
    }

    public void update(String title, String forename, String surname)
    {
        this.currentData = new CustomerData(this, title, forename, surname);
    }

    public String getTitle()
    {
        return this.currentData.getTitle();
    }

    public String getForename()
    {
        return this.currentData.getForename();
    }

    public String getSurname()
    {
        return this.currentData.getSurname();
    }
}
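
As mentioned above, keeping current_data_id in sync can be automated with a trigger. A minimal sketch of what such a trigger might look like, assuming the schema above (this is not part of the original answer):

DELIMITER //
CREATE TRIGGER customer_data_after_insert
AFTER INSERT ON customer_data
FOR EACH ROW
BEGIN
    -- point the customer row at its newest customer_data record
    UPDATE customer
    SET current_data_id = NEW.id
    WHERE id = NEW.customer_id;
END//
DELIMITER ;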

Answer 5 (score: 2):

SELECT *, CONCAT(title,' ',forename,' ',surname) AS name FROM customer c 
INNER JOIN customer_data d ON c.id = d.customer_id WHERE name LIKE '%Smith%' 

I think you need to change c.customer_id to c.id,

or else update your table structure.

Answer 6 (score: 0):

It's a good idea to log the actual data into the "customer_data" table. With that data you can then select whatever you need from the "customer_data" table.

Answer 7 (score: 0):

This might help:

How to select the most recent set of dated records from a mysql table

You can use a subquery to get the most recent record and then join it to your customer.
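
As a rough sketch of that idea (my reading, not code from the linked answer; it assumes customer_data has an auto-increment id column), the subquery can sit directly in the join condition:

SELECT *, CONCAT(d.title, ' ', d.forename, ' ', d.surname) AS name
FROM customer c
INNER JOIN customer_data d
    ON d.customer_id = c.customer_id
    AND d.id = (
        SELECT MAX(d2.id)
        FROM customer_data d2
        WHERE d2.customer_id = c.customer_id
    )
LIMIT 10, 20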

Answer 8 (score: 0):

You could also do it like this:

SELECT    CONCAT(title, ' ', forename, ' ', surname) AS name
FROM      customer c
LEFT JOIN (
              SELECT * FROM customer_data ORDER BY id DESC
          ) customer_data ON (customer_data.customer_id = c.customer_id)
WHERE     CONCAT(title, ' ', forename, ' ', surname) LIKE '%Smith%'
GROUP BY  c.customer_id
LIMIT     10, 20;