Wednesday, June 25, 2014

Draft Entities with Hibernate Validator

Let's say we want to apply different validation rules depending on whether a record is being saved as a draft or published.

The Entity:

@Entity 
public class BlogPost {

  @Id
  private Long id;

  //always enforce this constraint:
  @NotBlank
  private String title;

  // only enforce this constraint if our validation group is "Publish"
  @NotBlank( groups = { Publish.class } )
  private String body;

  ...getters and setters...
}

//the "Publish" validation group:
public interface Publish extends Default {};


The HTML form (Spring MVC):

...
<form:form modelAttribute="blogPost">
  <form:label path="title">Title <form:errors path="title"/></form:label>
  <form:input path="title"/>

  <form:label path="body">Body <form:errors path="body"/></form:label>
  <form:textarea path="body"></form:textarea>

  <button type="submit" name="action" value="save-draft">Save Draft</button>
  <button type="submit" name="action" value="publish">Publish</button>

</form:form>
...

The Spring MVC Controller:

@Controller
@RequestMapping("/blog-posts")
public class BlogPostController {
 
  /* this method is called if we hit the Save Draft button. It validates the blogPost with the Default group (ignoring the constraint on Body) */

  @RequestMapping( method = RequestMethod.POST, params="action=save-draft" )
  String saveDraft( @Validated BlogPost blogPost, BindingResult result) {
    ...
  }

  /* this method is called if we hit the Publish button. It validates the blogPost with the Publish group, checking the @NotBlank constraint on Body */

  @RequestMapping( method = RequestMethod.POST, params="action=publish" )
  String publish( @Validated({Publish.class}) BlogPost blogPost, BindingResult result) { 
    ...
  }

}
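
You can exercise the same groups outside of Spring MVC, for example in a unit test. Here's a minimal sketch using the Bean Validation API directly (it assumes Hibernate Validator is on the classpath and that BlogPost has the usual getters and setters; the class and variable names are just for illustration):

import java.util.Set;
import javax.validation.ConstraintViolation;
import javax.validation.Validation;
import javax.validation.Validator;

public class DraftValidationDemo {

  public static void main(String[] args) {
    Validator validator = Validation.buildDefaultValidatorFactory().getValidator();

    BlogPost post = new BlogPost();
    post.setTitle("My draft");   // title set, body left blank

    // Default group: only the title constraint is checked, so no violations
    Set<ConstraintViolation<BlogPost>> asDraft = validator.validate(post);

    // Publish group: the @NotBlank on body is checked as well, so one violation
    Set<ConstraintViolation<BlogPost>> asPublish = validator.validate(post, Publish.class);

    System.out.println(asDraft.size());    // prints 0
    System.out.println(asPublish.size());  // prints 1
  }
}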

Wednesday, June 18, 2014

How to import Geonames data into PostgreSQL on Windows

  1. Download http://download.geonames.org/export/dump/allCountries.zip (251MB) to %ProgramData%\geonames\, and unzip it.
  2. In PostgreSQL, create a table to import the data into:
create table geonames (
  geoname_id int primary key,
  name text,
  ascii_name text,
  alternate_names text,
  latitude numeric(8,6),
  longitude numeric(9,6),
  feature_class char(1),
  feature_code text,
  country_code char(2),
  cc2 TEXT,
  admin1_code text,
  admin2_code text,
  admin3_code text,
  admin4_code text,
  population bigint,
  elevation int,
  dem text,
  timezone text,
  modification_date date
);

Run the COPY command (server-side COPY needs superuser; from psql you can run the equivalent \copy instead):

copy geonames
from 'c:/programdata/geonames/allCountries/allCountries.txt'
with (
  format text,
  null ''
);

On my laptop it took ~65 seconds. There were 9,071,443 rows.
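
A quick sanity check after the import (this scans the whole table, so it takes a few seconds):

select count(*) from geonames; --roughly the 9 million rows noted above (the dump changes daily)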

Let's say we just want to view the ISO 3166-1 countries. Let's create an index on the feature_code column so we can quickly filter on it (warning, this can take a while):

create index on geonames (feature_code);

And now a view to show only the ISO countries:

create view iso_countries as 
select
*
from geonames 
where 
(
feature_code IN ( 'PCL', 'PCLD', 'PCLF', 'PCLI', 'PCLIX', 'PCLS' )
or geoname_id in (2461445, 5880801, 607072, 6697173) --western sahara, american samoa, svalbard and jan mayen, antarctica
)
and geoname_id <> 831053 --kosovo
order by
country_code;
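
A quick usage check:

select country_code, name
from iso_countries;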

Sunday, June 8, 2014

Pulling Exchange Rates From A JSON Web API with PostgreSQL

  1. This probably isn't a good idea on a busy server, as the HTTP request blocks the backend while it runs
  2. Requires PL/PythonU (see the note just below this list for enabling it)
  3. Requires an App ID from openexchangerates.org; the free plan only supports USD as the base currency
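
To enable PL/PythonU (assuming PostgreSQL 9.1 or later, where procedural languages are packaged as extensions; run as a superuser in the target database):

create extension if not exists plpythonu;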

A Function To Make An HTTP Request

create or replace function http_get(url text)
returns text strict volatile
language plpythonu as $$

  import urllib2

  return urllib2.urlopen(url).read()

$$;
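
A quick smoke test (any URL that returns a body will do):

select http_get('http://example.com/');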

A Rate Cache Table and a Function to Convert an Amount to USD


CREATE TABLE exchange_rates_usd
(
  counter_currency char(3),
  close_date date,
  midmarket_rate numeric(19,10) NOT NULL,
  PRIMARY KEY (counter_currency, close_date)
);

create function usd(amount numeric(19,4), _counter_currency char(3), _close_date date = current_date) returns numeric(19,4) language plpgsql strict volatile as $$
declare
 result numeric(19,4);
 response json;
begin

 -- try the cache first
 select 
  midmarket_rate 
 from exchange_rates_usd 
 where 
  counter_currency = _counter_currency 
  and close_date = _close_date 
 into result;

 if found then return amount / result; end if;

 -- not cached yet: fetch all of the day's rates from the API
 response := http_get( format('https://openexchangerates.org/api/historical/%s.json?app_id=%s', _close_date, current_setting('my.OPENEXCHANGERATES_APP_ID') ) )::json;

 -- cache every rate returned for that date
 insert into exchange_rates_usd (counter_currency, close_date, midmarket_rate) 
 select 
  rates.key::char(3)
  , _close_date
  , rates.value::text::numeric(19,10) 
 from json_each(response->'rates') rates;

 select 
  midmarket_rate 
 from exchange_rates_usd 
 where 
  counter_currency = _counter_currency 
  and close_date = _close_date 
 into result;

 return amount / result;

end $$;

The Output

/* you can set this value in postgresql.conf as well: */
set my.OPENEXCHANGERATES_APP_ID = 'whatever your app id is';

select usd(100, 'CAD', '2014-08-14');

91.66
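
Subsequent calls for the same date are served from the exchange_rates_usd table instead of hitting the API. You can see the cached rate with:

select *
from exchange_rates_usd
where counter_currency = 'CAD' and close_date = '2014-08-14';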