How to check if a DataFrame is empty in Scala?

Last Updated : 27 Mar, 2024

In this article, we will learn how to check whether a DataFrame is empty in Scala. We can check if a DataFrame is empty by using the isEmpty method or by checking whether the row count is zero.

Syntax:

val isEmpty = dataframe.isEmpty

OR,

val isEmpty = dataframe.count() == 0

Here's how you can do it:

Example #1: Using the isEmpty method

Scala
import org.apache.spark.sql.{DataFrame, SparkSession}

object DataFrameEmptyCheck {
  def main(args: Array[String]): Unit = {

    // Create SparkSession
    val spark = SparkSession.builder()
      .appName("DataFrameEmptyCheck")
      .master("local[*]")
      .getOrCreate()

    // Sample DataFrame (replace this with
    // your actual DataFrame)
    val dataframe: DataFrame = spark.emptyDataFrame

    // Check if DataFrame is empty
    val isEmpty = dataframe.isEmpty
    if (isEmpty) {
      println("DataFrame is empty")
    } else {
      println("DataFrame is not empty")
    }

    // Stop SparkSession
    spark.stop()
  }
}

Output:

DataFrame is empty

Explanation:

  1. The code creates a SparkSession, which is the entry point to Spark functionality.
  2. It defines a sample DataFrame using spark.emptyDataFrame, which creates an empty DataFrame. You would typically replace this with your actual DataFrame.
  3. The code then checks if the DataFrame is empty using the isEmpty method. Since we initialized it as an empty DataFrame, the condition isEmpty will evaluate to true.
  4. If the DataFrame is empty, it prints "DataFrame is empty".
  5. Finally, the SparkSession is stopped to release resources.
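To see isEmpty distinguish the two cases, here is a minimal sketch that also builds a small non-empty DataFrame for contrast (the sample rows and column names are made up for illustration; it assumes the same spark session as in the example above):

```scala
import spark.implicits._

// Hypothetical two-row DataFrame for contrast
val nonEmpty = Seq((1, "a"), (2, "b")).toDF("id", "value")

println(nonEmpty.isEmpty)              // false: at least one row exists
println(spark.emptyDataFrame.isEmpty)  // true: no rows at all
```

In recent Spark versions, isEmpty only needs to find a single row to decide, so it is typically cheaper than a full count() on large DataFrames.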

Example #2: Using the count function

Scala
import org.apache.spark.sql.{DataFrame, SparkSession}

object DataFrameEmptyCheck {
  def main(args: Array[String]): Unit = {

    // Create SparkSession
    val spark = SparkSession.builder()
      .appName("DataFrameEmptyCheck")
      .master("local[*]")
      .getOrCreate()

    // Sample DataFrame (replace this with
    // your actual DataFrame)
    val dataframe: DataFrame = spark.emptyDataFrame

    // Check if DataFrame is empty
    val isEmpty = dataframe.count() == 0
    if (isEmpty) {
      println("DataFrame is empty")
    } else {
      println("DataFrame is not empty")
    }

    // Stop SparkSession
    spark.stop()
  }
}

Output:

DataFrame is empty

Explanation:

  1. The code creates a SparkSession with the master set to "local[*]", which means it runs locally using all available CPU cores.
  2. It defines a sample DataFrame using spark.emptyDataFrame, creating an empty DataFrame. This DataFrame has no rows.
  3. The code then checks whether the DataFrame is empty using the count() function, which returns the number of rows in the DataFrame.
  4. Since the DataFrame has no rows, the condition dataframe.count() == 0 evaluates to true.
  5. It therefore prints "DataFrame is empty".
  6. Finally, the SparkSession is stopped to release resources.
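Note that count() scans every partition to compute the full row count, which can be expensive on large DataFrames when all you need is an empty/non-empty answer. A common alternative is to fetch at most one row and check whether anything came back; a sketch, assuming the same dataframe as above:

```scala
// These stop after the first row instead of counting every row
val emptyViaHead  = dataframe.head(1).isEmpty
val emptyViaTake  = dataframe.take(1).isEmpty
val emptyViaLimit = dataframe.limit(1).count() == 0
```

All three return true for an empty DataFrame; on large data they are usually much cheaper than a full count().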

Author: raushanikuf9x7
Article Tags: Scala