SAP Cloud Platform Java Development – Part 2

Access SAP HANA in the Neo Environment

In the first part of this blog series I introduced you to Java development based on the Spring framework, demonstrating the basics with a simple RestController.

What would an application be without access to a database? In the SAP Cloud context, SAP HANA is the obvious choice. In this part of the blog series I show you how to access the database in the Neo environment.

JNDI – The yellow pages

The HANA database is accessed via JNDI, in my opinion one of the most exciting technologies in the Java environment. With a JNDI lookup, resources managed by the runtime environment can be loaded. The idea behind it is simple: the runtime environment takes care of instantiating the required classes and makes them available to the application.

In order for the application presented here to support JNDI, a few steps are necessary. First, create a subdirectory called webapp in the directory src > main, and within it a subdirectory called WEB-INF. In the directory src > main > webapp > WEB-INF, all that remains is to create a file named web.xml.

Folder structure for web.xml

Resources managed by JNDI must be listed in the web.xml file as resource references (resource-ref). The name (res-ref-name) under which the resource is addressed and the underlying Java class (res-type) must be defined.

<?xml version="1.0" encoding="UTF-8"?>
<web-app xmlns="http://java.sun.com/xml/ns/javaee"
 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
 xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_3_0.xsd"
 version="3.0">
   <resource-ref>
     <res-ref-name>jdbc/DefaultDB</res-ref-name>
     <res-type>javax.sql.DataSource</res-type>
   </resource-ref>
</web-app>

Declare Spring Data Dependency

The first step is complete. Since we do not implement database access with hand-written SQL commands but use Spring Data, we have to declare a corresponding dependency in pom.xml.

<dependency>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>

Since Spring Data supports a variety of databases, the runtime environment must be told which database is used. This is done in the application.properties file. If this file does not yet exist, create it in the directory src > main > resources. The fully qualified class name of the HANA driver is com.sap.db.jdbc.Driver.

spring.jpa.properties.hibernate.dialect = org.hibernate.dialect.HANAColumnStoreDialect
spring.jpa.properties.hibernate.connection.pool_size = 10
spring.datasource.driverClassName=com.sap.db.jdbc.Driver

@Configuration

The @Configuration annotation is part of the Spring Core framework. It indicates that a class contains @Bean definition methods. This allows the Spring container to process the class and generate Spring beans at runtime that can be used in the application.

For use on the Neo stack, the data source must be loaded via JNDI. A corresponding configuration class must be created for this. The following code snippet shows the complete class.

Neo Datasource Configuration

package at.clouddna.demo;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;
import org.springframework.jdbc.datasource.lookup.JndiDataSourceLookup;

import javax.sql.DataSource;
import java.sql.SQLException;

@Configuration
@Profile({"neo"})
public class NeoConfig {

	// The empty destroyMethod prevents Spring from attempting to close
	// the container-managed data source on shutdown.
	@Bean(destroyMethod = "")
	public DataSource jndiDataSource() throws IllegalArgumentException, SQLException {
		JndiDataSourceLookup dataSourceLookup = new JndiDataSourceLookup();
		return dataSourceLookup.getDataSource("java:comp/env/jdbc/DefaultDB");
	}
}

The JNDI lookup targets the name java:comp/env/jdbc/DefaultDB. The prefix java:comp/env/ is always the same on the SAP Cloud Platform. The name jdbc/DefaultDB behind it corresponds to the res-ref-name in web.xml.

Development of an entity class

The use of Spring Data allows very efficient development of the persistence layer. Spring Data builds on the Hibernate framework. As soon as we create a class and annotate it with @Entity, a corresponding table is created in the underlying database. In the following code snippet I show you a simple user class.

User.java

package at.clouddna.demo.model;

import javax.persistence.*;

@Entity
public class User {

    @Column(nullable = false)
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    protected Long id;

    private String firstname;
    private String lastname;

    public Long getId() {
        return this.id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    public void setFirstname(String firstname) {
        this.firstname = firstname;
    }

    public String getFirstname() {
        return this.firstname;
    }

    public void setLastname(String lastname) {
        this.lastname = lastname;
    }

    public String getLastname() {
        return this.lastname;
    }
}
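One caveat, based on my setup assumptions: whether the table is actually generated on startup depends on Hibernate's schema generation setting, which can be controlled in application.properties. The value shown here is an assumption for development scenarios; in production you would typically manage the schema differently.

```properties
# let Hibernate create/update tables for @Entity classes on startup
spring.jpa.hibernate.ddl-auto = update
```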

CRUD Methods

The great thing about Spring Data is the out-of-the-box availability of CRUD methods. All you have to do is create an interface that extends JpaRepository. This is shown for the User entity in the following code snippet.

package at.clouddna.demo.repository;

import at.clouddna.demo.model.User;
import org.springframework.data.jpa.repository.JpaRepository;

public interface IUserRepository extends JpaRepository<User, Long> {

}

The repository can now be used directly in the controller via autowiring. However, my company refrains from this: we always create an associated DTO (Data Transfer Object) for each entity class, and additionally a service class annotated with @Service that encapsulates the use of the repository. The service class is then injected into the controller via the @Autowired annotation.
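The UserDto class used by the service below is not shown as a listing in this post; a minimal sketch, assuming its fields simply mirror the User entity, could look like this:

```java
// Hypothetical sketch: would live in package at.clouddna.demo.dto
public class UserDto {

    private Long id;
    private String firstname;
    private String lastname;

    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    public String getFirstname() {
        return firstname;
    }

    public void setFirstname(String firstname) {
        this.firstname = firstname;
    }

    public String getLastname() {
        return lastname;
    }

    public void setLastname(String lastname) {
        this.lastname = lastname;
    }
}
```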

Of course I will show you how it works.

ServiceBase Class and ModelMapper

The mapping from the entity class to the DTO and vice versa is not done manually but with ModelMapper, which must be included in pom.xml as a dependency.

<dependency>
   <groupId>org.modelmapper</groupId>
   <artifactId>modelmapper</artifactId>
   <version>2.3.5</version>
</dependency>

ServiceBase.java

package at.clouddna.demo.service;

import org.modelmapper.ModelMapper;
import org.modelmapper.convention.MatchingStrategies;

import java.io.FileOutputStream;
import java.io.IOException;
import java.util.Collection;
import java.util.List;
import java.util.stream.Collectors;

public abstract class ServiceBase {

    private ModelMapper modelMapper;

    public ServiceBase(){
        this.modelMapper = new ModelMapper();
        this.modelMapper.getConfiguration().setMatchingStrategy(MatchingStrategies.STANDARD);
    }

    public <D, T> D map(T entity, Class<D> outClass) {
        return modelMapper.map(entity, outClass);
    }

    // Map a collection of entities to a list of the given target class.
    public <D, T> List<D> mapAll(Collection<T> entityList, Class<D> outClass) {
        return entityList.stream()
                .map(entity -> map(entity, outClass))
                .collect(Collectors.toList());
    }

    protected void writeToFile(String fileName, String content) throws IOException {
        FileOutputStream outputStream = new FileOutputStream(fileName);
        byte[] strToBytes = content.getBytes();
        outputStream.write(strToBytes);
        outputStream.close();
    }

    protected void writeToFile(String fileName, byte[] content) throws IOException {
        FileOutputStream outputStream = new FileOutputStream(fileName);
        outputStream.write(content);
        outputStream.close();
    }
}

Service Class

The service class inherits from the ServiceBase class and encapsulates access to the database. The following code snippet shows the UserService class. It is important that the class is annotated with @Service; this allows it to be injected into the controller via autowiring.

UserService.java

package at.clouddna.demo.service;

import at.clouddna.demo.dto.UserDto;
import at.clouddna.demo.model.User;
import at.clouddna.demo.repository.IUserRepository;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

import java.util.List;
import java.util.Optional;

@Service
public class UserService extends ServiceBase {

    @Autowired
    private IUserRepository repository;

    public UserDto create(UserDto userDto) {
        return map(repository.save(map(userDto, User.class)), UserDto.class);
    }

    public UserDto update(UserDto userDto) {
        return map(repository.save(map(userDto, User.class)), UserDto.class);
    }

    public boolean delete(Long id) {
        repository.deleteById(id);
        return true;
    }

    public UserDto findById(Long id) {
        Optional<User> userOptional = repository.findById(id);
        if(!userOptional.isPresent()) {
            return null;
        }
        return map(userOptional.get(), UserDto.class);
    }

    public List<UserDto> findAll() {
        return mapAll(repository.findAll(), UserDto.class);
    }
}

RestController

Finally, I will show you how the previously developed service can be used in the RestController for all CRUD methods. You will be surprised how easy it is!

UserController.java

package at.clouddna.demo.controller;

import at.clouddna.demo.dto.UserDto;
import at.clouddna.demo.service.UserService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;

@RestController
@RequestMapping("/user")
public class UserController {

    @Autowired
    private UserService userService;

    @GetMapping
    public ResponseEntity getAll() {
        return ResponseEntity.ok(userService.findAll());
    }

    @GetMapping("/{id}")
    public ResponseEntity getById(@PathVariable("id") Long id) {
        return ResponseEntity.ok(userService.findById(id));
    }

    @PostMapping
    public ResponseEntity create(@RequestBody UserDto body) {
        return ResponseEntity.ok(userService.create(body));
    }

    @PutMapping("/{id}")
    public ResponseEntity update(@PathVariable("id") Long id,
                                 @RequestBody UserDto body) {
        body.setId(id);
        return ResponseEntity.ok(userService.update(body));
    }

    @DeleteMapping("/{id}")
    public ResponseEntity delete(@PathVariable("id") Long id) {
        return ResponseEntity.ok(userService.delete(id));
    }
}
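Assuming the application runs locally on port 8080 (host and port are assumptions, adjust them to your deployment), the endpoints can be exercised with curl:

```shell
# create a user
curl -X POST http://localhost:8080/user \
     -H "Content-Type: application/json" \
     -d '{"firstname":"Max","lastname":"Mustermann"}'

# read all users / one user
curl http://localhost:8080/user
curl http://localhost:8080/user/1

# update and delete
curl -X PUT http://localhost:8080/user/1 \
     -H "Content-Type: application/json" \
     -d '{"firstname":"Maximilian","lastname":"Mustermann"}'
curl -X DELETE http://localhost:8080/user/1
```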

Conclusion

I hope that this part has made Java development on the SAP Cloud Platform appealing to you. As you have hopefully realized, it is neither witchcraft nor rocket science. One more tip: already during project planning, make sure that everything is clearly structured and define a separate package each for entities, DTOs, repositories, services and controllers.
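Based on the package names used in the snippets of this post, such a layout looks like this:

```
at.clouddna.demo
├── controller   (UserController)
├── dto          (UserDto)
├── model        (User)
├── repository   (IUserRepository)
└── service      (ServiceBase, UserService)
```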

SAP Cloud Platform Java Development

In this blog series I will show you how Java development for the SAP Cloud Platform is done based on Spring Boot. Spring Boot is the de facto standard framework for developing Java applications in the cloud / Software-as-a-Service context. There are many resources available on the Internet for development with Spring Boot, but the SAP Cloud Platform specifics mostly fall by the wayside. SAP offers an SDK in the Neo stack of the SAP Cloud Platform that enables the use of SAP's own functionality in Java. This blog focuses on the general developer tasks in the development of Java applications: the structure of the pom.xml is shown and a simple REST controller is implemented. I will show the following contents in future blogs:
  • Access HANA DB via JNDI
  • Access Destinations via JNDI
  • Access Tenant Information via JNDI
  • Access User Information

pom.xml Structure

The pom.xml contains all relevant information required to build the application.

Parent

Since we build on Spring Boot, it must be defined as the parent.
<parent>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-parent</artifactId>
  <version>2.3.1.RELEASE</version>
  <relativePath />
</parent>

Properties

The Java version and the version of the SAP Cloud SDK are defined in the properties.
<properties>
  <java.version>1.8</java.version>
  <sap.cloud.sdk.version>3.107.18</sap.cloud.sdk.version>
</properties>

Dependencies

The dependencies section declares, among other things, the Spring Boot Starter Web and the SAP Cloud SDK. For the SAP Cloud SDK it is important that the reference points to the version defined in the properties and that the scope is set to provided. This means the dependency is not packaged into the WAR file but is treated as provided by the server.
<dependency>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-web</artifactId> 
</dependency>

<dependency> 
  <groupId>com.sap.cloud</groupId>
  <artifactId>neo-java-web-api</artifactId>
  <version>${sap.cloud.sdk.version}</version>
  <scope>provided</scope>
</dependency>

<dependency>
  <groupId>com.google.code.gson</groupId>
  <artifactId>gson</artifactId>
  <version>2.8.6</version>
</dependency>

Profile

In this example, a separate profile is created for the deployment on the Neo stack. This profile defines that certain dependencies are provided by the server and are not embedded in the WAR file. Otherwise the start of the application would fail.
<profile>
  <id>neo</id>
  <activation>
    <activeByDefault>true</activeByDefault>
  </activation>
  <properties>
    <packaging.type>war</packaging.type>
  </properties> 
  <dependencies> 
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-api</artifactId>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>ch.qos.logback</groupId>
      <artifactId>logback-classic</artifactId>
      <scope>provided</scope>
    </dependency>  
  </dependencies>
</profile>

Build Plugins

Finally, the Spring Boot Maven plugin must be defined in the build.
<plugins>
  <plugin>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-maven-plugin</artifactId>
  </plugin>
</plugins>

Build the application for the SAP Cloud Platform

The build of the application can be executed from the command line with the following command:
mvn clean package -Pneo
In order to launch the application successfully from within Tomcat, the configure method of the SpringBootServletInitializer must be overridden. This can be done either directly in the main class of the application or in a separate class. Personally, I prefer a separate class for this.
package at.clouddna.demo;

import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.boot.web.servlet.support.SpringBootServletInitializer;

public class ServletInitializer extends SpringBootServletInitializer {
  @Override
  protected SpringApplicationBuilder configure(SpringApplicationBuilder application) {
    return application.sources(DocmgmtApplication.class);
  }
}
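The DocmgmtApplication referenced above is the Spring Boot main class; it is not shown in this post, but a typical minimal version (class and package names taken from the snippet above) would look like this:

```java
package at.clouddna.demo;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class DocmgmtApplication {

	public static void main(String[] args) {
		SpringApplication.run(DocmgmtApplication.class, args);
	}
}
```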

Implementation of a simple rest controller

So that a functional test can be carried out, I always implement a ping method in a separate controller.
package at.clouddna.docmgmt.controller; 

import org.slf4j.Logger;
import org.slf4j.LoggerFactory; 
import org.springframework.http.ResponseEntity; 
import org.springframework.web.bind.annotation.GetMapping; 
import org.springframework.web.bind.annotation.RequestMapping; 
import org.springframework.web.bind.annotation.RestController; 

@RestController 
@RequestMapping({ "/ping" }) 
public class PingController { 
  private static final Logger logger = LoggerFactory.getLogger(PingController.class); 

  @GetMapping
  public ResponseEntity<?> ping() { 
    logger.debug("ping called"); 
    return ResponseEntity.ok("Hello World"); 
  }
}

This means we are already done with a simple application. When deploying via the SAP Cloud Platform Cockpit, make sure that the Java Web Tomcat 8 runtime is selected and that the profile is passed as a JVM parameter as follows:
-Dspring.profiles.active=neo
The ping controller can be tested directly from the browser, since the corresponding method is called via an HTTP GET request.
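The same check can also be scripted; using the application URL shown in the cockpit (the placeholder below is an assumption), a quick smoke test looks like this:

```shell
# <application-url> is a placeholder for the URL shown in the SCP cockpit
curl https://<application-url>/ping
```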