Apache Pig Diagnostic Operators


The Load statement simply loads the data into the specified relation in Apache Pig. To verify the execution of a Load statement, you have to use the diagnostic operators. Pig Latin provides four different types of diagnostic operators:

  • Dump operator
  • Describe operator
  • Explanation operator
  • Illustration operator
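
All four are invoked on an already-loaded relation from the Grunt shell. Below is a minimal sketch of what each invocation looks like, assuming a relation named student has already been defined (as in the example later in this chapter):

grunt> Dump student;        -- run the plan and print every tuple of the relation
grunt> Describe student;    -- print the schema of the relation
grunt> Explain student;     -- show the logical, physical, and MapReduce execution plans
grunt> Illustrate student;  -- step through the plan on a small sample of the data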

In this chapter, we will discuss the Dump operator of Pig Latin.

Dump Operator

The Dump operator is used to run Pig Latin statements and display the results on the screen. It is generally used for debugging purposes.

Syntax

Given below is the syntax of the Dump operator.

grunt> Dump Relation_Name;

Assume that we have a file named student_data.txt in HDFS with the following content.

001,Rajiv,Reddy,9848022337,Hyderabad
002,siddarth,Battacharya,9848022338,Kolkata
003,Rajesh,Khanna,9848022339,Delhi
004,Preethi,Agarwal,9848022330,Pune
005,Trupthi,Mohanthy,9848022336,Bhuwaneshwar
006,Archana,Mishra,9848022335,Chennai
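
If the file is not yet present in HDFS, it can be copied there from the Grunt shell itself. A minimal sketch, assuming student_data.txt sits in the local working directory and the default filesystem is hdfs://localhost:9000:

grunt> fs -mkdir /pig_data
grunt> fs -put student_data.txt /pig_data/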

We have read it into a relation named student using the LOAD operator, as shown below.

grunt> student = LOAD 'hdfs://localhost:9000/pig_data/student_data.txt' 
   USING PigStorage(',')
   as ( id:int, firstname:chararray, lastname:chararray, phone:chararray, 
   city:chararray );

Now, let us print the contents of the relation using the Dump operator, as shown below.

grunt> Dump student;

Once the above Pig Latin statement is executed, it will start a MapReduce job to read the data from HDFS and produce the following output.

2015-10-01 15:05:27,642 [main]
INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 
100% complete
2015-10-01 15:05:27,652 [main]
INFO  org.apache.pig.tools.pigstats.mapreduce.SimplePigStats - Script Statistics:   
HadoopVersion  PigVersion  UserId    StartedAt             FinishedAt       Features             
2.6.0          0.15.0      Hadoop  2015-10-01 15:03:11  2015-10-01 05:27     UNKNOWN
                                                
Success!  
Job Stats (time in seconds):
  
JobId           job_14459_0004
Maps                 1  
Reduces              0  
MaxMapTime          n/a    
MinMapTime          n/a
AvgMapTime          n/a 
MedianMapTime       n/a
MaxReduceTime        0
MinReduceTime        0  
AvgReduceTime        0
MedianReducetime     0
Alias             student 
Feature           MAP_ONLY        
Outputs           hdfs://localhost:9000/tmp/temp580182027/tmp757878456,

Input(s): Successfully read 0 records from: "hdfs://localhost:9000/pig_data/
student_data.txt"
  
Output(s): Successfully stored 0 records in: "hdfs://localhost:9000/tmp/temp580182027/
tmp757878456"  

Counters: Total records written : 0 Total bytes written : 0 Spillable Memory Manager 
spill count : 0 Total bags proactively spilled: 0 Total records proactively spilled: 0  

Job DAG: job_1443519499159_0004
  
2015-10-01 15:06:28,403 [main]
INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Success!
2015-10-01 15:06:28,441 [main] INFO  org.apache.pig.data.SchemaTupleBackend - 
Key [pig.schematuple] was not set... will not generate code.
2015-10-01 15:06:28,485 [main]
INFO  org.apache.hadoop.mapreduce.lib.input.FileInputFormat - Total input paths 
to process : 1
2015-10-01 15:06:28,485 [main]
INFO  org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths
to process : 1

(1,Rajiv,Reddy,9848022337,Hyderabad)
(2,siddarth,Battacharya,9848022338,Kolkata)
(3,Rajesh,Khanna,9848022339,Delhi)
(4,Preethi,Agarwal,9848022330,Pune)
(5,Trupthi,Mohanthy,9848022336,Bhuwaneshwar)
(6,Archana,Mishra,9848022335,Chennai)
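
Because Dump materializes a relation and prints it, it is most useful for inspecting intermediate results while debugging a script. A minimal sketch, building on the student relation above (the FILTER condition is purely illustrative):

grunt> delhi_students = FILTER student BY city == 'Delhi';  -- keep only the rows whose city field is Delhi
grunt> Dump delhi_students;                                  -- materialize and print just this intermediate relation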

