Thursday, September 1, 2016

[Algorithm] Counting Bits



Problem Statement:

Given a non-negative integer num, for every number i in the range 0 ≤ i ≤ num, calculate the number of 1's in its binary representation and return them as an array.

Example:
For num = 5 you should return [0,1,1,2,1,2].


Solution: Dynamic Programming


Time Complexity : O(n)
Space Complexity: O(n)

This problem can be solved in O(32n) time with brute force, simply counting the bits of each number independently. A sketch of that baseline is included below for comparison; a better way is dynamic programming, based on the observation that follows it.
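
A minimal brute-force sketch, kept here only as a baseline for comparison:

class Solution {
public:
    vector<int> countBits(int num) {
        vector<int> result(num + 1, 0);
        for (int i = 0; i <= num; i++) {
            // Count the set bits of i directly, one bit at a time (at most 32 steps).
            for (int x = i; x != 0; x >>= 1) {
                result[i] += (x & 1);
            }
        }
        return result;
    }
};

Back to the dynamic programming idea: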

Any number i can be expressed as:

(1)  i = j + 2^k, where j < 2^k

The reason we pick 2^k is that it has exactly one bit set to 1, with all the rest 0. Since j < 2^k, the 1 bits of j never overlap with the single 1 bit of 2^k. Thus we can conclude:

(2) bits_count[i] = bits_count[j] + bits_count[2^k], where i = j + 2^k

With (1) and (2) (for example, 6 = 2 + 4 gives bits_count[6] = bits_count[2] + bits_count[4] = 1 + 1 = 2), we can easily write the following implementation:

class Solution {
public:
    vector<int> countBits(int num) {
        if (num == 0) return {0};
        
        vector<int> result(num+1);
        result[0] = 0;
        result[1] = 1;
        
        int base = 1;   // largest power of two (2^k) seen so far
        for (int i=2; i<=num; i++) {
            if (i == base*2) {
                // i is itself a power of two, so it has exactly one set bit
                result[i] = 1;
                base *= 2;
            } else {
                // i = base + (i - base) with (i - base) < base, so the bits never overlap
                result[i] = result[base] + result[i - base];
            }
        }
        
        return result;
    }
};


Sunday, August 21, 2016

[Algorithm] Patching Array



Problem Statement:

Given a sorted positive integer array nums and an integer n, add/patch elements to the array such that any number in range [1, n] inclusive can be formed by the sum of some elements in the array. Return the minimum number of patches required.

Solution: Greedy + Math


This problem seems astonishingly hard at first glance. If you try to enumerate all possible combinations, the time complexity easily explodes; indeed this cannot be solved by brute force.

Now let's consider what it means by this statement:

"any number in range [1, n] inclusive can be formed by the sum of some elements in the array"

This is of course a requirement of the problem statement, but there is more to it: it also tells us a useful property. Suppose we have an array whose elements can be combined to sum up to each and every value in the range [1, n]. Then if we add an element with value n+1, we can extend the range to [1, n+n+1]. Why is that? See the following, where I use C[i] to represent one combination that adds up to the number i.

For range [1, n] we have:

C[1] = 1, C[2] = 2, C[3] = 3, ..., C[i] = i, ... C[n] = n

Now if we add one more element n+1 to this sequence we will have:

C[1] = 1, C[2] = 2, C[3] = 3, ..., C[i] = i, ... C[n] = n, C[0] + n+1 = n+1, C[1] + n+1 = n+2, ..., C[i] + n+1 = i+n+1, ... C[n] + n+1 = n+n+1

This property is the key to solving the problem. Let's call the range we have covered so far the consolidated_range. At the beginning the consolidated_range is 0, which implies we have not taken any elements yet. As we walk along the input array, we add elements to the final sequence and thereby extend our consolidated_range. At each step there are two possibilities:

1. The consolidated_range can connect with the next element nums[i] in the array (connect means consolidated_range+1 >= nums[i]). In this case the consolidated_range will be extended by nums[i], according to the property described above. We don't count this as a patch, because the element is already in the original array.

2. The consolidated_range cannot connect with the element nums[i], or there is no element left. This implies a patch, and the optimal patch is to add a number with value consolidated_range+1, because every value up to consolidated_range already has a combination that sums to it, so by the property above the range extends to consolidated_range + consolidated_range + 1.

We will keep extending the consolidated range until it hits the target range. Code implementation as follows:

class Solution {
public:
    int minPatches(vector<int>& nums, int n) {
        long long consolidated = 0;
        int len = nums.size();
        int i=0;
        int count=0;
        
        /* consolidated means we can sum up to every value in [1, consolidated]
        *  1) When nums[i] <= consolidated+1, we can extend consolidated by nums[i]:
        *     if we can reach every value up to consolidated without using nums[i],
        *     then with nums[i] we can also reach every value up to consolidated + nums[i].
        *
        *  2) When nums[i] > consolidated+1 (or no element is left), we need a patch,
        *     and the optimal number to add is consolidated+1, because we can then use
        *     consolidated+1 as the base and reach every value up to
        *     (consolidated + consolidated + 1).
        */
        while (consolidated < n) {
            if (i < len && nums[i] <= consolidated+1) {
                consolidated += nums[i++];
            } else {
                consolidated += (consolidated+1);
                count++;
            }
        }
        
        return count;
    }
};
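
As a quick sanity check, here is a small hypothetical driver (it assumes the Solution class above is in scope); for nums = [1, 3] and n = 6, one patch (the number 2) is enough:

#include <iostream>
#include <vector>
using namespace std;

int main() {
    vector<int> nums = {1, 3};
    // consolidated grows 0 -> 1 (take 1) -> 3 (patch 2) -> 6 (take 3), so one patch.
    cout << Solution().minPatches(nums, 6) << endl;   // prints 1
    return 0;
}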




Saturday, August 20, 2016

[DevNote] Uploading Multipart File with Spring MVC



In this article we are going to examine how to set up multipart file uploading and handling in Spring MVC.

It is common to submit images as part of a form submission, and Spring MVC provides MultipartFile to represent the submitted multipart content.

Keywords: Spring MVC, MultipartFile

Here are the Maven dependencies for the modules we need for MultipartFile handling:
  <dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-webmvc</artifactId>
    <version>${springframework.version}</version>
  </dependency>

  <!-- Servlet -->
  <dependency>
    <groupId>javax.servlet</groupId>
    <artifactId>javax.servlet-api</artifactId>
    <version>3.1.0</version>
  </dependency>

  <dependency>
    <groupId>javax.servlet</groupId>
    <artifactId>jstl</artifactId>
    <version>1.2</version>
  </dependency>

  <dependency>
    <groupId>javax.validation</groupId>
    <artifactId>validation-api</artifactId>
    <version>1.1.0.Final</version>
  </dependency>

Most of the needed dependencies are already pulled in by the core Spring MVC artifact; the only extra one is javax.validation, since we are going to use the @Valid annotation to validate the data the user uploaded.


Configuration


To resolve the multipart content in the HTTP request, we need a resolver for MultipartFile. Spring MVC provides the MultipartResolver interface as the strategy for resolving multipart requests, and we can use CommonsMultipartResolver as the implementation we want to use. So the first thing we need is to create a bean method for it in our application configuration.

    @Bean
    public CommonsMultipartResolver multipartResolver() {
        CommonsMultipartResolver resolver=new CommonsMultipartResolver();
        resolver.setDefaultEncoding("utf-8");
        return resolver;
    }

Next we need to configure multipart handling on our servlet, so that the servlet knows what to expect from the request. There are two ways of doing this configuration: one is the @MultipartConfig annotation and the other is creating a MultipartConfigElement and passing it to the servlet registration.
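
For reference, the annotation route looks roughly like this on a plain servlet (a hypothetical class with placeholder limits, not the configuration used in this project):

import javax.servlet.annotation.MultipartConfig;
import javax.servlet.http.HttpServlet;

// Hypothetical servlet: multipart settings are declared directly on the class.
@MultipartConfig(
        location = "/tmp/uploads",               // temporary upload location
        maxFileSize = 5L * 1024 * 1024,          // maximum size per file (5 MB)
        maxRequestSize = 20L * 1024 * 1024,      // maximum size per request (20 MB)
        fileSizeThreshold = 0)                   // size above which files are written to disk
public class AnnotatedUploadServlet extends HttpServlet {
}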

I used the second option:
    @Override
    protected void customizeRegistration(ServletRegistration.Dynamic registration) {
        registration.setMultipartConfig(getMultipartConfigElement());
    }

    private MultipartConfigElement getMultipartConfigElement() {
        MultipartConfigElement multipartConfigElement = new MultipartConfigElement(
                FileDataConfig.TEMPROOT, FileDataConfig.MAX_FILE_SIZE,
                FileDataConfig.MAX_REQUEST_SIZE, FileDataConfig.FILE_SIZE_THRESHOLD);
        return multipartConfigElement;
    }

Final Paths, Temp Paths


In the MultipartConfig, we configured four things:

Temporary upload location
Maximum file size allowed
Maximum request size
File size threshold

One thing to pay attention to in this registration is the first argument of the MultipartConfigElement constructor: it is the temporary directory where the uploaded file initially lands, so it should not be confused with the final upload location.

The final path is never configured on the resolver; it is the business logic that determines the final upload location.
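
The FileDataConfig referenced in the registration above (and in the controller later) is just a constants holder; a minimal sketch could look like this, with placeholder values rather than the ones from the actual project:

public final class FileDataConfig {

    // Temporary directory used by the container while parsing multipart requests.
    public static final String TEMPROOT = "/tmp/uploads";

    // Final location where the business logic stashes accepted files.
    public static final String STASHROOT = "/var/data/stash";

    public static final long MAX_FILE_SIZE = 5L * 1024 * 1024;       // 5 MB per file
    public static final long MAX_REQUEST_SIZE = 20L * 1024 * 1024;   // 20 MB per request
    public static final int FILE_SIZE_THRESHOLD = 0;                 // write to disk immediately

    private FileDataConfig() {
    }
}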

File Validation


We will usually want to put constraints on the content of the incoming HTTP request. This can be achieved by implementing a container for the MultipartFile and a Validator, then binding the content to the validator to perform validation.

A file container can be defined as simple as follows:
public class FileContainer {

 private MultipartFile file;
 
 public MultipartFile getFile() {
  return file;
 }
 
 public void setFile(MultipartFile file) {
  this.file = file;
 }
}

Then our file validator will be:
@Component
public class FileValidator implements Validator {

 @Override
 public boolean supports(Class<?> obj) {
  return FileContainer.class.isAssignableFrom(obj);
 }

 @Override
 public void validate(Object obj, Errors error) {
  FileContainer file = (FileContainer) obj;
  if (file.getFile() != null) {
   if (file.getFile().getSize() == 0) {
    error.reject("file", "missing.file");
   } else if (!file.getFile().getContentType().equals("application/pdf")) {
    /* Allowed only PDF, which has MIME type of application/pdf,
     * application/x-pdf is experimental type not the official standard
     * */
    error.reject("file", "unsupported type");
   }
  }
 }

}

An instance of FileValidator will be bound to the FileContainer, so we need to make sure the FileValidator is simple and has no internal state, because it will be injected as a singleton.

Binding of the two looks like this:
@Controller
public class FileController {

 @Autowired
 private FileValidator fileValidator;
 
 @InitBinder("fileContainer")
 protected void initFileContainerBinder(WebDataBinder binder) {
  binder.setValidator(fileValidator);
 }
}

Controller


Let's examine how we use them altogether in the controller:

 @RequestMapping(value = "/uploadFile", method = RequestMethod.POST)
 public String DemoUploadSingleFile(HttpServletRequest request, @Valid FileContainer fileContainer,
   BindingResult binding) {
  
  if (binding.hasErrors()) {
   return "hasError";
  } else {
   MultipartFile file = fileContainer.getFile();
   String filename = file.getOriginalFilename();
   try {
    Files.copy(file.getInputStream(), Paths.get(FileDataConfig.STASHROOT, filename));
    return "success";
   } catch (IOException | RuntimeException e) {
    return "hasError";
   }
  }
 }

Since we've already coded most of the validation and content handling elsewhere, the controller simply needs to check whether the MultipartFile is valid and then save the file to its final location.

[Algorithm] Rearrange String k Distance Apart





Problem Statement:


Given a non-empty string str and an integer k, rearrange the string such that the same characters are at least distance k from each other.

All input strings are given in lowercase letters. If it is not possible to rearrange the string, return an empty string "".


Solution: Sorting + Greedy


This is actually a very good problem, so let me share my thinking process a bit.

(Idea A) Initially I was thinking naively of counting the occurrences of each character and inserting characters in alphabetical order, keeping track of how many characters have been added since the previous occurrence of the same character and comparing that count with k; if it is smaller than k, the distance between the two occurrences must be smaller than k. This logic is flawed because I didn't consider a case like this:

"abbcc", k = 2

Checking that the count added since the previous occurrence is at least k does not guarantee that the same characters end up k apart, simply because the head of the alphabetical sequence is no longer available.

(Idea B) Then I began to think that we probably should use the frequency of each character as the order instead of alphabetical order. For example:

"abbcc" is:

a : 1
b : 2
c : 2

This gives an insertion order of "b->c->a"; because we insert every available character during each round, the order is kept throughout the process. But this one is still flawed, as in this case:

"aaabc", k = 2

The second and third "a" will violate the constraint no matter what. The above approaches fail because I was trying to maximize the possible distance between two occurrences of a character; since the total length of the string is fixed, this direction of greediness makes some other part of the string fall short of the constraint.

So this suggests that the greediness should go in the other direction: use the minimum distance that still satisfies the constraint. When we think about it this way, we shed some light on one important condition:

To make sure the same character is at least k apart, we just need to ensure that within each window of size k there is no duplicate letter. To make this hold across windows we can use the ordering from (Idea B), but instead of fixing the order up front, we update it after each window of insertions (always taking the most frequent remaining letters first). This guarantees that each letter inside a window is kept at the largest possible distance from its next occurrence in the following window. (If that distance is still smaller than k, the rearrangement is impossible.)

Code implementation:

class Solution {
public:
    // Node: (remaining count, character)
    typedef pair<int, char> Node;
    string rearrangeString(string str, int k) {
        if (k == 0) return str;
        
        vector<Node> letters(26);
        for (char c : str) {
            letters[c - 'a'].first++;
            letters[c - 'a'].second = c;
        }
        reorderLetters(letters);
        
        string result;
        int depletion = 0;
        while (!letters.empty()) {
            // A window with fewer than k distinct letters is only allowed once,
            // for the final partial window; needing it twice means failure.
            if (k > letters.size() && (++depletion) > 1) return "";
            
            for (int i=0; i<k && i<letters.size(); i++) {
                result += letters[i].second;
                letters[i].first--;
            }
            reorderLetters(letters);
        }
        
        return result;
    }
    
    // Sort letters by remaining count (descending), break ties alphabetically,
    // and drop letters whose count has reached zero.
    inline void reorderLetters(vector<Node>& letters) {
        if (letters.empty()) return;
        std::sort(letters.begin(), letters.end(), [](const Node& lhs, const Node& rhs) {
            if (lhs.first == rhs.first) return lhs.second < rhs.second;
            return lhs.first > rhs.first;
        });
        
        while (!letters.empty() && letters.back().first == 0) {
            letters.pop_back();
        }
    }
};
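
A quick driver to sanity-check the cases discussed above (a hypothetical snippet; it assumes the Solution class above is in scope):

#include <iostream>
#include <string>
using namespace std;

int main() {
    Solution s;
    cout << s.rearrangeString("abbcc", 2) << endl;  // prints "bcabc"
    cout << s.rearrangeString("aaabc", 2) << endl;  // prints "abaca"
    cout << s.rearrangeString("aaabc", 3) << endl;  // prints "" (three a's would need length >= 7)
    return 0;
}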




Friday, August 19, 2016

[Algorithm] Maximum Size Subarray Sum Equals k



Problem Statement:

Given an array nums and a target value k, find the maximum length of a subarray that sums to k. If there isn't one, return 0 instead.

Example 1:

Given nums = [1, -1, 5, -2, 3], k = 3,
return 4. (because the subarray [1, -1, 5, -2] sums to 3 and is the longest)

Example 2:

Given nums = [-2, -1, 2, 1], k = 1,
return 2. (because the subarray [-1, 2] sums to 1 and is the longest)

Follow Up:
Can you do it in O(n) time?



Solution: Prefix Sum + Hash Table


This problem can be solved using prefix sums. The question really boils down to the equation prefix_sum[i] - prefix_sum[j] = k (where i > j), and we need the maximum gap between i and j. We can use a hash table keyed by prefix sum whose value is the index where that sum first occurs; if several indices share the same prefix sum, we only need to keep the lowest one. We also put the key-value pair (0, -1) into the table to cover the case where the whole prefix ending at i is itself a subarray that meets the requirement.

Once the hash table is built, we can do an O(n) pass over each prefix_sum[i], look up prefix_sum[j] = prefix_sum[i] - k, and keep the maximum length i - j. For Example 1, the prefix sums of [1, -1, 5, -2, 3] are [1, 0, 5, 3, 6]; at i = 3 we have prefix_sum[3] = 3 and look up 3 - 3 = 0, which is stored at index -1, giving length 3 - (-1) = 4.

Code implementation (version 1):


class Solution {
public:
    int maxSubArrayLen(vector<int>& nums, int k) {
        int n = nums.size();
        unordered_map<int, int> prefix;
        prefix[0] = -1;   // empty prefix: allows subarrays that start at index 0
        
        int sum = 0;
        for (int i=0; i<n; i++) {
            sum += nums[i];
            if (prefix.find(sum) == prefix.end()) {
                prefix[sum] = i;
            }
            nums[i] = sum;   // reuse nums[] to store the prefix sums
        }
        
        int len = 0;
        for (int i=n-1; i>=0; i--) {
            int current = nums[i];
            // Once i reaches the lowest stored index for this prefix sum, remove the
            // entry so that later (smaller) i can only match an index j < i.
            if (prefix[current] == i) prefix.erase(current);
            
            int target = current - k;
            auto it = prefix.find(target);
            if (it != prefix.end()) {
                len = max(len, i - it->second);
            }
        }
        
        return len;
    }
};


This implementation does some redundant work: the way it maintains the relationship i > j is by scanning backwards and removing prefix_sum[i] from the table once i is the lowest index for that prefix sum. The removal can be avoided if we search while we insert new prefix sums, i.e. look up prefix_sum[j] before inserting prefix_sum[i].

Code implementation (version 2):


class Solution {
public:
    int maxSubArrayLen(vector<int>& nums, int k) {
        int n = nums.size();
        unordered_map<int, int> prefix;
        prefix[0] = -1;
        
        int len = 0;
        int sum = 0;
        for (int i=0; i<n; i++) {
            sum += nums[i];
            
            int target = sum - k;
            auto it = prefix.find(target);
            if (it != prefix.end()) {
                len = max(len, i - it->second);
            }
            
            if (prefix.find(sum) == prefix.end()) {
                prefix[sum] = i;
            }
        }
        
        return len;
    }
};
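
A small driver for the two examples from the problem statement (hypothetical; it assumes the Solution class above is in scope):

#include <iostream>
#include <vector>
using namespace std;

int main() {
    vector<int> nums1 = {1, -1, 5, -2, 3};
    vector<int> nums2 = {-2, -1, 2, 1};
    cout << Solution().maxSubArrayLen(nums1, 3) << endl;  // prints 4
    cout << Solution().maxSubArrayLen(nums2, 1) << endl;  // prints 2
    return 0;
}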



Tuesday, August 9, 2016

[DevNote] JWT Token-Based Authentication in Spring MVC



Keywords: Spring MVC, Spring Security, Jwt, MongoDB

Session-based authentication requires the server to keep session information for client logins, which makes the server stateful and raises scalability problems. Token-based authentication has several advantages, since the server is freed from all the session bookkeeping. Moreover, token-based authentication is immune to cross-site request forgery (CSRF) by design, and the token can be shared in the HTTP request header with other domains.

This article lists many more advantages of token-based authentication compared with cookie-based authentication:

https://auth0.com/blog/angularjs-authentication-with-cookies-vs-token/

So how do we make our Spring MVC backend serve token-based authentication? Here I share my own example using JSON Web Token (JWT), which is "an open, industry standard RFC 7519 method for representing claims securely between two parties" according to its official site. JWT has libraries available in many languages; for Java we are going to use the jjwt library to generate, sign and validate our tokens.

All the source code in this blog is available from Github: https://github.com/yhuang69/PoormanPaperSearch


First things first, the JWT library we are using:

  <!-- JWT -->
  <dependency>
   <groupId>io.jsonwebtoken</groupId>
   <artifactId>jjwt</artifactId>
   <version>0.4</version>
  </dependency>

In Spring, we can usually assign the token handling task to a customized filter; this middleware can perform tasks like extracting, parsing/decoding and verifying the token. In the following example, I created a JwtTokenAuthenticationFilter extending GenericFilterBean.


package com.hackinghorse.mockPaperSearch.secure;

import java.io.IOException;

import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;

import org.apache.log4j.Logger;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.security.authentication.UsernamePasswordAuthenticationToken;
import org.springframework.security.core.context.SecurityContextHolder;
import org.springframework.security.core.userdetails.UserDetails;
import org.springframework.security.core.userdetails.UserDetailsService;
import org.springframework.security.web.authentication.WebAuthenticationDetailsSource;
import org.springframework.web.filter.GenericFilterBean;

import com.hackinghorse.mockPaperSearch.secure.service.SimpleTokenStrategy;

import io.jsonwebtoken.Claims;

public class JwtTokenAuthenticationFilter extends GenericFilterBean {
 
 static Logger log = Logger.getLogger(JwtTokenAuthenticationFilter.class.getName());
 
 @Autowired
 private SimpleTokenStrategy jwtStrategy;
 
 
 @Autowired
 private UserDetailsService userDetailsService;
 
 @Override
 public void doFilter(ServletRequest req, ServletResponse res, FilterChain filterChain)
   throws IOException, ServletException {
  final HttpServletRequest request = (HttpServletRequest) req;
  
  String tokenStr = jwtStrategy.getTokenFromRequest(request);
  
  /* 
   * TODO: Change this filter to authentication provider and JwtAuthenticationToken strategy 
   * so that we can throw an AuthenticationException directly.
   * 
   * Use AuthenticationProvider and Authentication Interfaces instead!
   * 
   */
  if (tokenStr != null && SecurityContextHolder.getContext().getAuthentication() == null) {
   try {
    final Claims claims = jwtStrategy.parseToken(tokenStr);
    String username = claims.getSubject();
    UserDetails userDetails = userDetailsService.loadUserByUsername(username);
    if (jwtStrategy.validateToken(claims, userDetails)) {
     UsernamePasswordAuthenticationToken authentication = new UsernamePasswordAuthenticationToken(
       userDetails, null, userDetails.getAuthorities());
     authentication.setDetails(new WebAuthenticationDetailsSource().buildDetails(request));
     SecurityContextHolder.getContext().setAuthentication(authentication);
    }
   } catch (Exception e) {
    log.debug("mockPaperSearchDev: JWT token is invalid");
   }
  }
  
  filterChain.doFilter(req, res);
 }

}

In doFilter we first extract the token from the incoming HttpServletRequest, then we use a helper class to parse the token into claims. The Claims object, which contains the payload of the JWT token, has all the information we need about this user. In my token the subject (the "sub" field) is the username, and the token also contains its creation time, expiration time, and the roles granted to the user. During parsing, any malformed token will cause the parser to throw an exception.

If the token is found, the next step is to load the user document from our database and verify that the user exists and the token has not expired. If the token is valid, we create a UsernamePasswordAuthenticationToken object and set it into our security context so that our AuthenticationManager can work with this authentication token.

In fact, I think it's better to use the AuthenticationManager interface and let it delegate the authentication to an AuthenticationProvider, so we can throw an AuthenticationException directly when we have a malformed or missing token. (I may follow up with an example of that.)

Now here's how to set up the filter in our filter chain:


@Configuration
@EnableWebSecurity
@ComponentScan(basePackages = "com.hackinghorse.mockPaperSearch")
public class AuthenConfiguration extends WebSecurityConfigurerAdapter {

    @Autowired
    private UnauthorizedEntryPoint unauthorizedHandler;

    @Autowired
    private UserDetailsService userDetailsService;

    @Bean
    public JwtTokenAuthenticationFilter authenticationTokenFilterBean() {
        return new JwtTokenAuthenticationFilter();
    }

    @Override
    public void configure(AuthenticationManagerBuilder authenticationManagerBuilder) throws Exception {
        authenticationManagerBuilder
                .userDetailsService(this.userDetailsService);
    }

    @Bean(name = "authenticationManager")
    @Override
    public AuthenticationManager authenticationManagerBean() throws Exception {
        return super.authenticationManagerBean();
    }

    @Override
    protected void configure(HttpSecurity http) throws Exception {
        http.csrf().disable()
                .exceptionHandling().authenticationEntryPoint(unauthorizedHandler).and()
                .sessionManagement().sessionCreationPolicy(SessionCreationPolicy.STATELESS).and()
                .authorizeRequests()
                .antMatchers("/login", "/signUp", "/static/**", "/search", "/search/**").permitAll()
                .antMatchers("/user**", "/user/**", "/login/refresh", "/uploadFile", "/userFiles/**").access("hasRole('SUBSCRIBER')")
                .antMatchers("/staff**", "/staff/**", "/signUpStaff").access("hasRole('STAFF')")
                .antMatchers("/db**", "/db/**").access("hasRole('ADMIN')");

        http.addFilterBefore(authenticationTokenFilterBean(), UsernamePasswordAuthenticationFilter.class);

        // disable page caching
        http.headers().cacheControl();
    }
}

We first need to configure our UserDetailsService on our AuthenticationManager. Then we add our JwtTokenAuthenticationFilter to the filter chain inside the configure() function. In configure() we also make the server stateless by using SessionCreationPolicy.STATELESS and disable CSRF, since JWT tokens are immune to it. Finally, we add access rules to our URLs to restrict them to permitted roles.

If the user is unauthorized, we need to send back a 401 response. I did this by registering the following AuthenticationEntryPoint, so every unauthorized request will be handled by this class.


@Component
public class UnauthorizedEntryPoint implements AuthenticationEntryPoint {

 @Override
 public void commence(HttpServletRequest request, HttpServletResponse response, AuthenticationException exception)
   throws IOException, ServletException {
  response.sendError(HttpServletResponse.SC_UNAUTHORIZED, "Unauthorized");
 }

}

So that's all the additional setup we need for token-based authentication in Spring; every request to a URL that requires authentication will be filtered by our JwtTokenAuthenticationFilter.

Of course we need database support for the User models; there's no magic to it, so I will skip that part.

For logging in, we need a controller that handles the user credentials sent from the client, generates a valid token and returns it to the client. Here's a very straightforward example:


 @RequestMapping(value="/login", method=RequestMethod.POST)
 public ResponseEntity<TokenResponse> authenticateUser(@RequestBody TokenRequest tokenRequest) 
   throws AuthenticationException {
  String token = authenticateAndSignToken(tokenRequest.getUsername(), tokenRequest.getPassword());
  subscriberDao.update(tokenRequest.getUsername(), "lastLogin", TimeString.now());
  return new ResponseEntity<TokenResponse>(new TokenResponse(token), HttpStatus.OK);
 }
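
authenticateAndSignToken is not shown in the snippet above; a rough sketch of what it might look like follows (the fields authenticationManager, jwtStrategy and userDetailsService are my assumptions here, the real implementation lives in the GitHub repo):

    @Autowired
    private AuthenticationManager authenticationManager;

    @Autowired
    private SimpleTokenStrategy jwtStrategy;

    @Autowired
    private UserDetailsService userDetailsService;

    private String authenticateAndSignToken(String username, String password) {
        // Let Spring Security verify the credentials; this throws AuthenticationException on failure.
        Authentication authentication = authenticationManager
                .authenticate(new UsernamePasswordAuthenticationToken(username, password));
        SecurityContextHolder.getContext().setAuthentication(authentication);

        // Load the user and sign a fresh JWT token for it.
        UserDetails userDetails = userDetailsService.loadUserByUsername(username);
        return jwtStrategy.createToken(userDetails);
    }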

All the JWT-related functions are packaged in the ITokenStrategy classes; here are some code snippets for those functions (full code is available in the GitHub repo of this project).

Token creation and signing:


    public String createToken(final UserDetails user) {
        Map<String, Object> claims = new HashMap<>();
        claims.put(claimUseranme, user.getUsername());
        claims.put(claimAdmin, user.getAuthorities().contains(
                new SimpleGrantedAuthority("ROLE_" + UserRole.ADMIN.getRole())));
        claims.put(claimCreated, new Date(System.currentTimeMillis()));
        return generateToken(claims);
    }

    protected String generateToken(Map<String, Object> claims) {
        return Jwts.builder()
                .setClaims(claims)
                .setExpiration(generateExpirationDate())
                .signWith(SignatureAlgorithm.HS512, secretKey)
                .compact();
    }
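
generateToken() calls generateExpirationDate(), which is omitted from the snippet above; a minimal sketch would be the following (the one-hour lifetime is an assumption, the real value is configured in the project):

    protected Date generateExpirationDate() {
        // The token expires a fixed amount of time after it is issued (here: one hour).
        return new Date(System.currentTimeMillis() + 60 * 60 * 1000L);
    }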


Token parsing:


 public String getTokenFromRequest(HttpServletRequest request) {
  return request.getHeader(headerKey);
 }
 
 public Claims parseToken(final String tokenStr) 
   throws ExpiredJwtException, MalformedJwtException, SignatureException {
  return Jwts.parser().setSigningKey(secretKey).parseClaimsJws(tokenStr).getBody();
 }
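
Token validation, used by the filter's jwtStrategy.validateToken(claims, userDetails) call, also lives in the strategy class. A simplified sketch of the checks could look like this (the real checks are in the GitHub repo):

    public boolean validateToken(Claims claims, UserDetails userDetails) {
        // A token is considered valid when its subject matches the loaded user
        // and its expiration time has not passed yet.
        String username = claims.getSubject();
        Date expiration = claims.getExpiration();
        return username != null
                && username.equals(userDetails.getUsername())
                && expiration != null
                && expiration.after(new Date());
    }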