Scala throwing a NullPointerException

Date: 2018-07-04 09:12:14

Tags: scala apache-spark

I am running the code below to analyse data with Spark, and I get a NullPointerException while executing it. There is an if condition with the following check to filter out empty data, but I still receive the NullPointerException. Could anyone help me?

uPopulation != null && !uPopulation.isEmpty()

My code, which outputs the 0th and 10th string values:

package com.anil.wb
import org.apache.spark.sql.SparkSession
import java.lang.Long

object WorldBankDataAnalysis {
  def main(args: Array[String]) {

    System.setProperty("hadoop.home.dir", "D:\\BigData\\Hadoop_setups\\hadoop- 
      2.5.0-cdh5.3.2")
    System.setProperty("spark.sql.warehouse.dir", 
   "file:/D:/BigData/Spark_setups/spark-2.0.2-bin-hadoop2.6/spark-warehouse")

    val spark = SparkSession.builder.appName("UrbanPopulation").master("local").getOrCreate()

    val data = spark.read.csv("D:\\WorldBankAnalysis\\World_Bank_Indicators.csv").rdd

    val result = data.map { line =>
      val uPopulation = line.getString(10).replaceAll(",", "")
      var uPopNum = 0L
      if(uPopulation.length() > 0){
        uPopNum = Long.parseLong(uPopulation)
      }          
      (uPopNum, line.getString(0))
    }
    .sortByKey(false)

    //spark.sparkContext.parallelize(Seq(result)).saveAsTextFile(args(1))
    result.foreach { println }
    spark.stop
  }
}

Sample data:

Afghanistan,7/1/2000,0,,0,,151,11,8,"25,950,816","5,527,524",51,45,45,45,48,50,2    
Afghanistan,7/1/2001,0,,0,0,150,11,9,"26,697,430","5,771,984",50,46,45,46,48,50,2,"2,461,666,315",92
Afghanistan,7/1/2002,0,,"25,000",0,150,22,7,"27,465,525","6,025,936",49,46,46,46,48,50,2,"4,338,907,579",158    
Afghanistan,7/1/2003,0,,"200,000",0,151,25,8,"28,255,719","6,289,723",48,46,46,46,48,50,2,"4,766,127,272",169    
Afghanistan,7/1/2004,0,,"600,000",0,150,30,9,"29,068,646","6,563,700",47,46,46,46,48,50,2,"5,704,202,651",196    
Afghanistan,7/1/2005,0,,"1,200,000",1,151,33,9,"29,904,962","6,848,236",47,47,47,47,48,50,2,"6,814,753,581",228    
Afghanistan,7/1/2006,0,11,"2,520,366",2,151,24,7,"30,751,661","7,158,987",46,47,47,47,48,50,2,"7,721,931,671",251    
Afghanistan,7/1/2007,0,18,"4,668,096",2,150,29,7,"31,622,333","7,481,844",45,47,47,47,47,50,2,"9,707,373,721",307    
Afghanistan,7/1/2008,0,19,"7,898,909",2,150,32,7,"32,517,656","7,817,245",45,48,47,48,47,51,2,"11,940,296,131",367    
Afghanistan,7/1/2009,0,21,"12,000,000",3,149,34,8,"33,438,329","8,165,640",44,48,48,48,47,51,2,"14,213,670,485",425    
Afghanistan,7/1/2010,0,,"13,000,000",4,149,38,8,"34,385,068","8,527,497",44,48,48,48,46,51,2,"17,243,112,604",501

For some countries, the 10th string value is empty.

2 Answers:

Answer 0 (score: 1)

import org.apache.spark.sql.functions.udf
import spark.implicits._ // required for the $"colName" syntax

val df = spark.read.csv("D:\\WorldBankAnalysis\\World_Bank_Indicators.csv")

// Guard against null/empty cells, otherwise the UDF itself throws
// a NullPointerException on exactly the rows in question.
val extractNum = udf((s: String) =>
  if (s == null || s.isEmpty) 0L else s.replace(",", "").toLong)

val newDf = df.select(extractNum($"_c10").alias("population"), $"_c0".alias("country"))
newDf.show()
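
If the descending order from the question's sortByKey(false) is still wanted, a DataFrame-side equivalent (a hedged suggestion, assuming the newDf defined above) would be:

// Largest population first, mirroring sortByKey(false) from the question
newDf.orderBy($"population".desc).show()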

Answer 1 (score: 0)

Based on that code, my guess is that line.getString(10) is null in some rows. Hence, val uPopulation = ... throws the NullPointerException, and you never get to the uPopulation != null check (which is useless anyway, because replaceAll never returns null).

As Ramesh's comment mentioned, you should first look at the line the exception is reported on: if the theory above is correct, it should be the val uPopulation = ... line containing the replaceAll call.
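
If that theory holds, a minimal sketch of the fix is to test the raw value before calling replaceAll (variable names follow the question's code; the 0L default for missing values is an assumption):

val result = data.map { line =>
  val raw = line.getString(10)  // may be null when the CSV cell is empty
  val uPopNum =
    if (raw != null && raw.nonEmpty) raw.replaceAll(",", "").toLong
    else 0L                      // assumed default for missing values
  (uPopNum, line.getString(0))
}.sortByKey(false)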
