What are the Spark Debug Tools?

The Spark Debug Tools currently consist of only one tool, the DebugScreen:

[Screenshot: the Spark DebugScreen]

Debug Screen Features:

  • Highlights exception-throwing code and provides easy stacktrace navigation
  • Controls for Googling the exception and copying the stacktrace
  • Panel showing important information:
    • Headers
    • Route parameters
    • Query parameters
    • Request attributes
    • Session attributes
    • Cookies
    • Environment details

Setting up your project

The Spark Debug Tools work best if your project follows a standard Maven directory layout, so step one should be to create a Spark Maven project (→ Tutorial).
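
For reference, a standard Maven layout looks roughly like this (file names other than pom.xml are placeholders):

    your-project/
        pom.xml                  <-- dependencies go here
        src/main/java/           <-- application code (e.g. DebugScreenExample.java)
        src/main/resources/      <-- templates and static files
        src/test/java/           <-- tests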

Enabling the DebugScreen

To enable the debug tool, add the following dependency to your pom.xml:

    <dependency>
        <groupId>com.sparkjava</groupId>
        <artifactId>spark-debug-tools</artifactId>
        <version>0.5</version>
    </dependency>

Then, add a call to enableDebugScreen() to your Spark application. The code below shows a full example (note the static imports):

    import static spark.Spark.*;
    import static spark.debug.DebugScreen.enableDebugScreen;

    public class DebugScreenExample {
        public static void main(String[] args) {
            port(4567);
            get("*", (req, res) -> {
                throw new Exception("Exceptions everywhere!");
            });
            // Add this line to your project to enable the debug screen
            enableDebugScreen();
        }
    }

The debug tool builds on Spark's exception(exceptionClass, exceptionHandler) method, which maps exception types to handlers. The debug screen maps Exception.class to a DebugScreen instance, allowing it to catch all uncaught exceptions and render a helpful error page.
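
To make that mechanism concrete, here is a minimal sketch of how a catch-all handler can be registered through Spark's exception() API. This is not the DebugScreen's actual source; the /boom route and the plain-text handler body are placeholders for illustration:

    import static spark.Spark.*;

    public class ExceptionMappingSketch {
        public static void main(String[] args) {
            // Register a handler for Exception.class; the DebugScreen registers
            // itself the same way, but renders the full debug page instead of
            // this plain-text body.
            exception(Exception.class, (exception, request, response) -> {
                response.status(500);
                response.body("Caught: " + exception.getMessage());
            });

            // Any uncaught exception thrown from a route ends up in the handler above.
            get("/boom", (req, res) -> {
                throw new Exception("Exceptions everywhere!");
            });
        }
    }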

Beta status, please report bugs

The tool is currently in beta status, so please report bugs in the comments below, or create an issue on GitHub.

Suggestions welcome

If you have an idea for something that would be nice to include in the Debug Screen, or an idea for another debug tool, please write your suggestion in the comments, or create an issue on GitHub.