

1. Convert String Type to Double Type Examples

Following are some PySpark examples that convert String Type to Double Type. If you want to convert to Float Type instead, simply replace Double with Float.

df.withColumn("salary",df.salary.cast('double'))
df.withColumn("salary",df.salary.cast(DoubleType()))
df.withColumn("salary",col("salary").cast('double'))
df.withColumn("salary",round(df.salary.cast(DoubleType()),2))
df.select("firstname",col("salary").cast('double').alias("salary"))
df.selectExpr("firstname","cast(salary as double) salary")

The DataFrame used in these examples has the columns |firstname|age|isGraduated|gender|salary|; note that the column salary is a string type.

2. withColumn() – Convert String to Double Type

First, use the PySpark DataFrame withColumn() transformation to convert the salary column from String Type to Double Type. withColumn() takes the column name you want to convert as the first argument, and for the second argument you apply the casting method cast().

df2 = df.withColumn("salary",df.salary.cast('double'))
df2 = df.withColumn("salary",df.salary.cast(DoubleType()))

In case you want to round the decimal value, use the round() function.

from pyspark.sql.types import DoubleType
from pyspark.sql.functions import col, round
df2 = df.withColumn("salary",round(df.salary.cast(DoubleType()),2))
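To make the withColumn() cast easy to try end to end, here is a minimal, self-contained sketch. The sample rows and the CastExample application name are made up for illustration; the column names follow the examples above.

# Minimal sketch of the withColumn() cast; sample rows are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql.functions import round
from pyspark.sql.types import DoubleType

spark = SparkSession.builder.appName("CastExample").getOrCreate()

data = [("James", 34, True, "M", "3000.6089"),
        ("Michael", 33, True, "F", "3300.8067"),
        ("Robert", 37, False, "M", "5000.5034")]
columns = ["firstname", "age", "isGraduated", "gender", "salary"]
df = spark.createDataFrame(data, columns)

df.printSchema()   # salary is inferred as string here

# Cast salary to double and round it to 2 decimal places
df2 = df.withColumn("salary", round(df.salary.cast(DoubleType()), 2))
df2.printSchema()  # salary is now double
df2.show(truncate=False)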


3. selectExpr() – Convert Column to Double Type

The following example uses the selectExpr() transformation of DataFrame in order to change the data type.

df3 = df.selectExpr("firstname","age","isGraduated","cast(salary as double) salary")

4. SQL – Cast using SQL Expression

After registering the DataFrame as a temporary view (named CastExample here), you can also change the column type from SQL using the DOUBLE() shorthand or CAST(salary AS DOUBLE).

spark.sql("SELECT firstname,DOUBLE(salary) as salary from CastExample")
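As a continuation of the sketch shown earlier (it reuses the hypothetical df and spark session built there), the selectExpr() and SQL paths look like this; the CastExample view name matches the query above.

# Continues the earlier sketch: df stores salary as a string, spark is the SparkSession.
# selectExpr() accepts SQL-style expressions, so the cast is written inline.
df3 = df.selectExpr("firstname", "age", "isGraduated", "cast(salary as double) salary")
df3.printSchema()

# For the SQL route, expose the DataFrame as a temp view first,
# then cast with DOUBLE(...) or CAST(salary AS DOUBLE).
df.createOrReplaceTempView("CastExample")
df4 = spark.sql("SELECT firstname, DOUBLE(salary) AS salary FROM CastExample")
df4.printSchema()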
