Sunday, October 16, 2016

Spring Boot error pages, angular ui-router, and grunt integration

Spring Boot has a useful feature for providing a global error page based on status code: place an HTML file named with the status code in your static resources.
http://docs.spring.io/spring-boot/docs/current/reference/htmlsingle/#boot-features-error-handling-custom-error-pages

One of the problems with this, however, is that if you want to maintain a separate javascript build for your web client and are using angular and ui-router as a single page webapp, maintaining your sources will become a pain.  For instance, if you want menus to continue to work, how are you going to accurately change state when the 404 page's location is something like "/non/existent/weird/path"?  Without using a template rendering library on the server side, this becomes a problem.

There are a lot of ways this could be handled.  You could jump through a lot of hoops rewriting all your urls.  You could utilize Spring ResourceTransformers.  You could ensure *everything* for headers, footers, and the like is pulled in as templates (or, god forbid, copy them and maintain them in two different locations).

Because the project I'm working on already utilized the grunt-processhtml plugin, I decided to just add additional targets for each of my error pages.

First, add the targets to your grunt configuration:
// 'paths' is set up earlier in my grunt file
processhtml: {
  mainapp: {
    options: {
      strip: true,
      data: {
        year: new Date().getFullYear(),
        localCssPath: paths.minCssName,
        localJsPath: paths.minJsName
      }
    },
    files: {
      'pathToBuildDir/apppath/index.html': ['app/index.html']
    }
  },
  error400: {
    options: {
      strip: true,
      data: {
        year: new Date().getFullYear(),
        localCssPath: paths.minCssName,
        localJsPath: paths.minJsName
      }
    },
    files: {
      'pathToBuildDir/error/400.html': ['app/index.html']
    }
  },
  error404: {
    options: {
      strip: true,
      data: {
        year: new Date().getFullYear(),
        localCssPath: paths.minCssName,
        localJsPath: paths.minJsName
      }
    },
    files: {
      'pathToBuildDir/error/404.html': ['app/index.html']
    }
  }
}
It goes without saying that you should adjust your build output accordingly; just ensure that the error directory ends up in a location Spring Boot recognizes for static resources (classpath:/static/ is one of the defaults).
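If you handle that copy step in grunt as well, a grunt-contrib-copy target along these lines is one way to do it.  This is just a sketch: 'pathToBuildDir' is the placeholder from my config above, and the destination assumes a standard Maven/Gradle layout where src/main/resources/static ends up on the classpath.

```javascript
// Hypothetical grunt-contrib-copy target; adjust paths to your build layout.
// Spring Boot serves classpath:/static/** by default, so anything copied
// under src/main/resources/static/error/ is picked up as an error page.
copy: {
  errorPages: {
    expand: true,
    cwd: 'pathToBuildDir/error/',
    src: '*.html',
    dest: 'src/main/resources/static/error/'
  }
}
```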

Now, let's do some templating within our index.html file like so:
<div class="page-container">
  <div class="site-content">
    <!-- build:remove:error400,error404 -->
    <div ui-view="mainBody"></div>
    <!-- /build -->
    <!-- build:include:error400 error/400.html --><!-- /build-->
    <!-- build:include:error404 error/404.html --><!-- /build-->
  </div>
</div>
What we are doing here is removing the ui-view div for our main app when building an error page, while pulling in static content for each error status.

This gets us most of the way there.  But what if your app is served under a subdirectory?  What if the 404 is on some nested path?

I tend to dislike using base hrefs (just an opinion), but we can leverage one here.  Add the following to the head of your index.html source:
<!-- build:include:error400,error404 error/base.html --><!-- /build-->
And the following in your base.html fragment:
<meta name="yourErrorPageTag" content="true" />
<base href="/apppath/" />
Now all relative links to images/js/css/etc will resolve back against our main app's path.
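As a sanity check, the effect of the base href can be sketched with the WHATWG URL API, which uses the same resolution algorithm as the browser (the host and paths here are made up for illustration):

```javascript
// With <base href="/apppath/">, relative asset links resolve against
// the app's base path, not against the page's actual (404) location.
const withBase = new URL('js/app.min.js', 'http://example.com/apppath/');
console.log(withBase.pathname); // "/apppath/js/app.min.js"

// Without the base, the same relative link resolves against whatever
// nested path produced the 404, breaking every asset reference:
const withoutBase = new URL('js/app.min.js', 'http://example.com/non/existent/weird/path');
console.log(withoutBase.pathname); // "/non/existent/weird/js/app.min.js"
```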

What is the meta tag for?  Well, that is just something I placed in that can be detected during state change for ui-router.  I added the following to my angular app to force changing the window location when a submenu is clicked in my menu bar:
$rootScope.$on('$stateChangeStart', function(evt, to, params) {
  var $window = $windowProvider.$get();
  // The meta tag is only present on the generated error pages
  var isError = $window.document.querySelector("meta[name='yourErrorPageTag']");
  if (isError) {
    evt.preventDefault();
    // Send the browser back to the real app, preserving the requested state hash
    $window.location = "/apppath/index.html" + $window.location.hash;
  }
});
This is kind of a dirty hack.  But now we can maintain our entire app within one source file.  I think this is good enough.

Sunday, March 29, 2015

Automatically syncing up your project reports, javadoc and groovydoc, README.md markdown, and github page with gradle and travis-ci.

I've been working on trying to start a *very* simple open source library, but quickly found that I don't want to spend a bunch of time keeping all my docs and websites for it in sync. I'd much rather make a quick README.md file that describes everything, and let my build system do the rest.

While exploring this, the first thing I came across was looking for a gradle site plugin, and I found a good one at https://bitbucket.org/davidmc24/gradle-site-plugin

Integrating this is simple, and you can place all your web files in your src folder under "site" per standard maven site convention.  But I also wanted to add all my gradle reports and my doclet outputs, not to mention my README.md file, which documents all the usage of the api. Here is how I found it best to do it:

buildscript {
    repositories {
        maven {
            name = 'BintrayJCenter'
            url = 'http://jcenter.bintray.com'
        }
        mavenCentral()
    }
    dependencies {
        classpath 'us.carrclan.david.gradle:gradle-site-plugin:0.2.0'
    }
}

apply plugin: 'site'

sourceSets {
    readmeSource {
        resources {
            srcDir "$projectDir"
            include 'README.md'
        }
    }
    site {
        resources {
            srcDir 'build/docs'
            srcDir 'build/reports'
            source readmeSource.resources
        }
    }
}
At this point I was able to make my page template, and also be able to link to any of my doclet pages (and reports if I choose). Next, I needed a simple way to import the README.md into my html template, so I don't have to duplicate all my usage/intro docs. I found some options like http://strapdownjs.com/, however it pretty much takes over your page with its embedded bootstrap configuration. It also requires placing your markdown div directly in your body, not embedded where you want it. I instead decided that doing this manually wouldn't be hard using the marked library with highlight.js, and came up with the following, which replaces any element (in my case an empty div) with an id of README:

<script type="text/javascript">
  var req = new XMLHttpRequest();
  req.onload = function() {
    var markdownString = this.responseText;
    marked.setOptions({
      highlight: function (code, lang) {
        // Skip highlighting for fenced blocks with no language hint
        if (lang === undefined) {
          return code;
        }
        return hljs.highlight(lang, code).value;
      }
    });
    var readmeNode = document.getElementById('README');
    readmeNode.innerHTML = marked(markdownString);
  };
  // Fetch asynchronously; synchronous XHR on the main thread is deprecated
  req.open('GET', 'README.md', true);
  req.send();
</script>
But I still didn't want to have to copy my site output manually and commit it to the gh-pages branch. If I just commit my changes, it should automatically publish, right? To bridge the gap between the maven site publishing and github pages, I leveraged a bash script in my travis-ci build. Below is the script I wrote. Note that I'm dumping all output from the actual git push to /dev/null to avoid exposing the github api key in case of an error.

#!/bin/bash
cp -Rvf build/resources/site gh-pages
cd gh-pages
git init
git config user.name "travis-ci"
git config user.email "travis@travis-ci.org"
git add .
git commit -m "Publishing site from Travis CI build $TRAVIS_BUILD_NUMBER"
git push --force --quiet "https://${GH_TOKEN}@${GH_REF}" master:gh-pages > /dev/null 2>&1
echo "Published site to gh-pages. See http://aweigold.github.io/lemming"
The last step is to put a hook into your .travis.yml file. For context I'm showing my entire travis file; your build will obviously differ. The important parts are the call to the gradle buildSite task provided by the plugin, the call to the bash script, and the environment variables. It's also very important to embed your github api key in a secure parameter; this file is checked in publicly, you know ;-)

language: java
after_success:
  - "./gradlew jacocoTestReport coveralls"
  - "./gradlew groovydoc buildSite"
  - ".travis/deploy_ghpages.sh"
env:
  global:
    - GH_REF: github.com/aweigold/lemming.git
    - secure: abcAB.....
For an example, check out my project page at http://aweigold.github.io/lemming/ and my source at https://github.com/aweigold/lemming.

Thursday, January 22, 2015

Quickly get an audit of your Spring Security mappings

Today I had a junior developer come to me in a panic. He needed to provide a full report of all the security roles required for every endpoint in our services, to verify against requirements. His (valiant) attempts to grep the codebase were producing unreadable results, and we had an unexpected deadline of "now!"

Luckily, I knew that all of our request mappings are configured via annotations, and all the roles were also defined by annotations directly on the request mappings. So I added a random classpath scanning library I found (https://sites.google.com/site/javacornproject/corn-cps) to the classpath of each service in IntelliJ, and fired up a Groovy console using the service's classpath. The following shows what I came up with:

// Add net.sf.corn:corn-cps:1.0.1 to your classpath
import net.sf.corn.cps.*
import org.springframework.stereotype.Controller
import org.springframework.web.bind.annotation.RequestMapping
import org.springframework.security.access.prepost.PreAuthorize

def classes = CPScanner.scanClasses(new PackageNameFilter("com.company.*"), new ClassFilter().appendAnnotation(Controller))
classes.each { clazz ->
    clazz.getDeclaredMethods().each { method ->
        RequestMapping mapping = method.getAnnotation(RequestMapping)
        if (mapping != null) {
            PreAuthorize auth = method.getAnnotation(PreAuthorize)
            if (auth != null) {
                println "URL: " + mapping.value()
                println "PERM: " + auth.value()
                println "------"
            }
        }
    }
}
return "DONE!"
Obviously, you'll need to update this if you do any configuration in XML, have Security annotations further down the stack, or use RequestMapping annotations at the class level. But the point is, the Groovy console in IntelliJ is your friend for quick one-off projects.

Saturday, September 20, 2014

Using your grunt project and/or bower components in your Spring application during development

I've been playing around with grunt, bower, and yo lately. It's pretty simple creating a build step to copy the dist/ output into your webjar, or war for production, but I wanted an easy way to bring in my javascript into my Spring app during development. Nothing bugs me more than having to perform a copy or rebuild step just to see my changes in the application during dev.

So I came up with this solution. Assuming your IDE runs your run configuration from your source root, you can create Spring resource handlers relative to that directory, as the gist below shows.
@Configuration
@Profile("dev")
public class DevWebResourceAdapter extends WebMvcConfigurerAdapter {
    @Override
    public void addResourceHandlers(ResourceHandlerRegistry registry) {
        registry.addResourceHandler("/**")
                .addResourceLocations("file:./submodule-web/app/");
        registry.addResourceHandler("/bower_components/**")
                .addResourceLocations("file:./submodule-web/bower_components/");
    }
}

Monday, August 12, 2013

IntelliJ IDEA, JUnit running, and "non Make" build steps.

IntelliJ IDEA provides some really awesome JUnit runners that allow you to right click test methods, classes, and packages, and quickly run/debug them.

The default JUnit configuration runs IDEA's "Make" prior to running your test. This, however, becomes incredibly painful when you have separate build steps, such as generated sources (protobuf), bytecode manipulation (jibx), etc.  Although you are able to change the default settings for the JUnit runner, those defaults are stored in the project workspace file, which shouldn't be checked into source control when working with other teams.

IntelliJ will detect default run configurations placed in the ipr, however, it will immediately remove it and place it in the workspace file... again, another conflict for source control.

This is where the Gradle idea plugin can come in.  Since all of the custom build steps should come in after running the "testClasses" target (assuming you are running a Java project... it's up to you to figure out other project types), one can add a Run configuration to run testClasses, and then make the default JUnit configuration depend on that run configuration.

apply plugin: 'idea'

idea {
    project {
        ipr.withXml {
            // Create a task to run gradle testClasses, which we will subsequently bind the default JUnit runner to
            def runConfigComp = it.node.appendNode('component')
            def runOpts = [
                "default": "false",
                "name": "testClasses for JUnit",
                "type": "GroovyScriptRunConfiguration",
                "factoryName": "Groovy"
            ]
            def testClassesConfigurationNode = runConfigComp.appendNode('configuration', runOpts)
            testClassesConfigurationNode.appendNode("setting", [name: "path", value: 'file://$PROJECT_DIR$/build.gradle'])
            testClassesConfigurationNode.appendNode("setting", [name: "params", value: "testClasses"])
            testClassesConfigurationNode.appendNode("setting", [name: "workDir", value: 'file://$PROJECT_DIR$'])
            testClassesConfigurationNode.appendNode("setting", [name: "debug", value: "false"])
            testClassesConfigurationNode.appendNode("RunnerSettings", [RunnerId: 'Run'])
            testClassesConfigurationNode.appendNode("ConfigurationWrapper", [RunnerId: 'Run'])
            testClassesConfigurationNode.appendNode("method").appendNode("option", [name: "Make", enabled: "false"])
            // Updates default JUnit settings, adding UseSplitVerifier, and swapping Make with testClasses
            runOpts = [
                "default": "true",
                "name": "JUnit",
                "type": "JUnit",
                "factoryName": "JUnit"
            ]
            def jUnitConfigurationNode = runConfigComp.appendNode('configuration', runOpts)
            jUnitConfigurationNode.appendNode("extension", [
                name: "coverage",
                enabled: "false",
                merge: "false",
                runner: "idea"
            ])
            jUnitConfigurationNode.appendNode("option", [name: "TEST_OBJECT", value: "class"])
            jUnitConfigurationNode.appendNode("option", [name: "VM_PARAMETERS", value: "-ea -XX:-UseSplitVerifier -XX:MaxPermSize=2048m"])
            jUnitConfigurationNode.appendNode("option", [name: "WORKING_DIRECTORY", value: 'file://$MODULE_DIR$'])
            jUnitConfigurationNode.appendNode("option", [name: "PASS_PARENT_ENVS", value: "true"])
            jUnitConfigurationNode.appendNode("option", [name: "TEST_SEARCH_SCOPE"]).appendNode("value", [defaultName: "moduleWithDependencies"])
            def jUnitMethodNode = jUnitConfigurationNode.appendNode("method")
            jUnitMethodNode.appendNode("option", [name: "Make", enabled: "false"])
            jUnitMethodNode.appendNode("option", [name: "RunConfigurationTask", enabled: "true", run_configuration_name: "testClasses for JUnit", run_configuration_type: "GroovyScriptRunConfiguration"])
        }
    }
}

Sunday, October 7, 2012

Windows Authentication for service running on Windows Server 2008 connecting to SQL Server running on Windows Server 2003

When running Tomcat on Windows, it is useful to run it as a service using a service account that has permissions to your SQL database so that you do not have to keep your credentials in a config file that can be compromised.

I ran into a problem where my service was unable to authenticate against a SQL instance running on Windows Server 2003 from a system running Windows Server 2008. It didn't seem to matter if I was running jTDS or the Microsoft provided JDBC drivers.

On the client side, I would immediately get I/O errors saying the DB server had closed the connection. On the server side, I would see the following errors in the Event Log produced by MSSQL (category: Logon):

"Length specified in network packet payload did not match number of bytes read; the connection has been closed. Please contact the vendor of the client library."

A vanilla installation of Windows Server 2003 will not be able to support NTLMv2, whereas a vanilla installation of Windows Server 2008 will not drop down to NTLM.

The best fix I found was changing the security policy on the client to drop down in authentication level. (There is a forum post that also references the fix, but it is specific to another application.)

Go to Local Security Policy (or set it on your domain), and under "Security Options", you will find "Network security: LAN Manager authentication level" with a default value of "Not Defined". Change it to "Send LM & NTLM - use NTLMv2 session security if negotiated".

Click apply, restart your service, and you will have database connectivity via Single Sign-On Windows Authentication.

Tuesday, January 10, 2012

Using a MultipartResolver with Spring and Spring Security concurrently

Update: This will not work with Spring 3.1. This is due to the ServletRequestMethodArgumentResolver being added by default prior to custom argument resolvers in a private method in the RequestMappingHandlerAdapter (getDefaultArgumentResolvers).

When using Spring Security, the CommonsMultipartResolver will not work. Why? Because the MultipartHttpServletRequest will be wrapped in a SecurityContextHolderAwareRequestWrapper, and will not be matched.
Of course, we don't want to fall back to just taking an HttpServletRequest as a parameter in our RequestMapping and parsing it out; we need to work smarter than that!
The best solution I could come up with is registering a custom WebArgumentResolver (below). If any readers out there have a better solution, please share!
Resolver:
public class SecurityContextWrappedMultipartRequestArgumentResolver implements WebArgumentResolver {
    private final CommonsMultipartResolver commonsMultipartResolver;

    public SecurityContextWrappedMultipartRequestArgumentResolver() {
        this.commonsMultipartResolver = new CommonsMultipartResolver();
    }

    @Override
    public Object resolveArgument(MethodParameter methodParameter, NativeWebRequest webRequest) throws Exception {
        if (MultipartHttpServletRequest.class.equals(methodParameter.getParameterType())) {
            Object object = webRequest.getNativeRequest();
            if (!(object instanceof SecurityContextHolderAwareRequestWrapper)) {
                return UNRESOLVED;
            }
            HttpServletRequest request = (HttpServletRequest) object;
            if (!ServletFileUpload.isMultipartContent(request)) {
                return UNRESOLVED;
            }
            SecurityContextHolderAwareRequestWrapper requestWrapper = (SecurityContextHolderAwareRequestWrapper) request;
            return commonsMultipartResolver.resolveMultipart(requestWrapper);
        }
        return UNRESOLVED;
    }
}

// Then wire up the HandlerAdapter (your config may vary)
@Bean
public HandlerAdapter handlerAdapter() {
    final AnnotationMethodHandlerAdapter handlerAdapter = new AnnotationMethodHandlerAdapter();
    handlerAdapter.setCustomArgumentResolver(new SecurityContextWrappedMultipartRequestArgumentResolver());
    handlerAdapter.setAlwaysUseFullPath(true);
    List<HttpMessageConverter<?>> converterList = new ArrayList<HttpMessageConverter<?>>();
    converterList.addAll(Arrays.asList(handlerAdapter.getMessageConverters()));
    converterList.add(jibxHttpMessageConverter);
    converterList.add(gsonHttpMessageConverter);
    converterList.add(csvLocalizationConverter);
    converterList.add(protobufLocalizationConverter);
    handlerAdapter.setMessageConverters(converterList.toArray(new HttpMessageConverter[converterList.size()]));
    return handlerAdapter;
}