spark-mongo connector to CSV in HDFS

Date: 2016-08-24 10:53:05

Tags: mongodb apache-spark apache-spark-sql

I am using the Spark-Mongo connector (from R) to query a collection, selecting a few fields and saving the result.

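A minimal sketch of the kind of SparkR 1.x read involved is shown below; the connector source name, URI, database, and collection are assumptions for illustration, not the exact original call:

    # Sketch only, assuming SparkR 1.x and the MongoDB Spark connector's
    # DefaultSource; the URI, database, and collection names are hypothetical.
    df <- read.df(sqlContext,
                  source = "com.mongodb.spark.sql.DefaultSource",
                  uri = "mongodb://server:27017/db",
                  database = "db",
                  collection = "members")
    registerTempTable(df, "members")   # expose the collection to Spark SQL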

The result is saved as Parquet files with JSON content, but I want simple plain text in HDFS.

How can I write this DataFrame to HDFS in CSV format? I was expecting something like:

    t2 <- sql(sqlContext, "select name,age from members")
    # With no source argument, saveDF writes the default data source (Parquet)
    saveDF(t2, "hdfs://server:8020/path/res")

1 answer:

Answer 0 (score: 1)

@Ross thanks, this is the solution:

    # Write the DataFrame out through the spark-csv data source,
    # overwriting any existing output at the target path
    write.df(dataframe, "hdfs://server:8000/path/hdfs", "com.databricks.spark.csv", "overwrite")
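write.df takes the DataFrame, the output path, the data source name, and the save mode; with no source argument it falls back to the default (Parquet), which is why the plain saveDF call above did not produce text files. The com.databricks.spark.csv source comes from the external spark-csv package, so it has to be on the classpath; one way to launch, assuming Spark 1.x (the version and Scala suffix below are assumptions to adjust):

    # Assumed package coordinates; match the artifact to your Spark/Scala build
    sparkR --packages com.databricks:spark-csv_2.10:1.5.0

Additional writer options such as header = "true" can be appended to the write.df call and are passed through to spark-csv.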