w**2 posts: 147 | 1 Could someone explain how to compute a 3-day moving average with window functions? I can't make sense of the error
message — why does it require a HiveContext?
Thanks!
Example below:
from pyspark.sql import Window
from pyspark.sql import SQLContext
import pyspark.sql.functions as func
Table T:
Date Num
07/01 2
07/02 3
07/03 2
07/04 2
07/05 5
07/06 6
07/07 7
sqlCtx = SQLContext(sc)
T.registerTempTable("T")
w = Window.partitionBy(T.Date).orderBy(T.Date).rangeBetween(-2,0)
a = (func.avg(T["Num"]).over(w))
T.select(T["Date"],T["Num"],a.alias("moving_avg"))
Error Msg:
Could not resolve window function 'avg'. Note that, using window functions
currently requires a HiveContext; | S*******e posts: 525 | 2 SQLContext supports only a limited set of SQL functions. HiveContext supports
many more, including the window functions you need. Anything SQLContext supports,
HiveContext also supports.
I think you only need to change "from pyspark.sql import SQLContext" to
"from pyspark.sql import HiveContext", and change "sqlCtx = SQLContext(sc)"
to "sqlCtx = HiveContext(sc)", and it should work. (By the way, I have very limited
knowledge of Python; I mainly use Java with Spark.)