

#1 2008-08-01 21:29:38

don martin
Member
Registered: 2008-06-27
Posts: 10

acquiring planet data

(1) I'm using osm2pgsql_latest (command line) attempting to parse planet-080507.osm.bz2 (also tried with planet-080423.osm.bz2), but it gets to node 211610 (node 211730 with planet-080423.osm.bz2) and trips an error 'allocating nodes'. Would appear to be a size-related issue? Any ideas?
(2) I'm using those older downloads because, using IE (click link - save/cancel dialog), any newer files result in a much smaller file download (184m -> 15m) than is indicated (~4g). Tried right-click save-target with the same result on newer files. Is there something else I can use (FTP)?
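
For reference, the invocation is along these lines (the database name 'gis' here is just a placeholder, not necessarily what I'm actually running with):

  # import the planet dump into the 'gis' database (name is a placeholder)
  osm2pgsql -d gis planet-080507.osm.bz2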

Offline

#2 2008-08-02 09:20:33

Lambertus
Inactive
From: Apeldoorn (NL)
Registered: 2007-03-17
Posts: 3,269
Website

Re: acquiring planet data

At first glance this looks like you've run out of disk space...


Mapping tools: Garmin GPSmap 60CSx, Giant Terrago 2002

Offline

#3 2008-08-02 09:28:39

don martin
Member
Registered: 2008-06-27
Posts: 10

Re: acquiring planet data

No - I've got 500 GB of disk space.

Offline

#4 2008-08-02 09:32:18

don martin
Member
Registered: 2008-06-27
Posts: 10

Re: acquiring planet data

I saw some reference in the Mapnik wiki to a 2 GB limit in wget?

Offline

#5 2008-08-02 10:29:59

Lambertus
Inactive
From: Apeldoorn (NL)
Registered: 2007-03-17
Posts: 3,269
Website

Re: acquiring planet data

don martin wrote:

I saw some reference in the Mapnik wiki to a 2 GB limit in wget?

Nah, countless people use wget daily to download the latest planet dump, so at least a reasonably up-to-date wget would certainly go beyond 2 GB.

Other options:
- Is your computer overclocked? I used a slightly overclocked machine once and experienced planet file corruption on download.
- Are you using Windows 98 with FAT16/32?


Mapping tools: Garmin GPSmap 60CSx, Giant Terrago 2002

Offline

#6 2008-08-02 18:47:45

don martin
Member
Registered: 2008-06-27
Posts: 10

Re: acquiring planet data

Thanks for the push-back Lambertus - we're 8-9 hours apart geospatially speaking - hence my less than prompt response.

The primary box is a fresh install of Windows 2003 Server (Std, SP2) on an Intel dual-core with 2 GB RAM and 500 GB of NTFS RAID1 storage, so hardware shouldn't be a constraint. I'm just standing it up as a map server and have a subset of OSM with SharpMap/OpenLayers running on it OK.

I'm having the same issue (error out at ~the same node) on two other machines - an older HP Windows 2003 server and a Dell running XP - though storage could be a factor on those.

Could items 1 & 2 (original post) be related, i.e. a problem with the downloaded files themselves? I'm seeing planet-080507.osm.bz2 as 4,117,547 KB and planet-080423.osm.bz2 as 4,025,070 KB, both BZ2 files downloaded from an OSM mirror.

Offline

#7 2008-08-02 20:35:48

emj
Member
From: .se (59.3N17.99E) 0735969076
Registered: 2006-06-18
Posts: 949

Re: acquiring planet data

don martin wrote:

allocating nodes

Well, the error message could be coming from Postgres or from the osm2pgsql tool itself. In this case it seems to be the tool that is at fault. Apparently you need more memory, or you should switch to using Postgres as storage for the nodes while you are importing.

Am I making sense? :-/

As for downloading: there are several tools that help with downloading, and if you are using wget you can use the -c option to make it continue a failed download. I don't think that will correct a corrupted file with garbage at the end, though...
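
For example (the URL is illustrative - point it at whichever mirror you've been downloading from):

  # -c resumes a partial download instead of starting over
  wget -c http://planet.openstreetmap.org/planet-080507.osm.bz2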

Offline

#8 2008-08-02 20:45:08

don martin
Member
Registered: 2008-06-27
Posts: 10

Re: acquiring planet data

Thanks for the push emj - I'm not sure I'm really following:
  Do I need more than 2 GB of RAM to process planet files with osm2pgsql?
  I'm pushing into Postgres/PostGIS - is there a flag to use with osm2pgsql to write as each node/way is read (or to batch the writes)?

I was using IE to download the planet files - is FTP a possibility?

Offline

#9 2008-08-02 22:07:48

Lambertus
Inactive
From: Apeldoorn (NL)
Registered: 2007-03-17
Posts: 3,269
Website

Re: acquiring planet data

I'm sorry, I have no clues left :(

Our NL tileserver is quite modest in 'size' and I must say it copes quite well (could do with a faster I/O subsystem though)... So 'only' 2 GB should not be a problem. Our tileserver worked fine with just 1 GB before.


Mapping tools: Garmin GPSmap 60CSx, Giant Terrago 2002

Offline

#10 2008-08-02 23:09:17

don martin
Member
Registered: 2008-06-27
Posts: 10

Re: acquiring planet data

Thanks Lambertus -

Re the 'NL tileserver' - I'm looking for data representing the other side of the globe, though (US/Alaska in particular), that I can get into a database. I'll look for other sources...

I would hope that another forum member might have some thoughts, given that I'm starting with a clean install (OS/Postgres/PostGIS etc.) with all components at default values, so I shouldn't be the first guy to climb this hill.

We'll see. If I come upon a solution elsewhere I'll share it back here.

Offline

#11 2008-08-03 00:31:18

emj
Member
From: .se (59.3N17.99E) 0735969076
Registered: 2006-06-18
Posts: 949

Re: acquiring planet data

I don't know the options, as I don't use the tool, but the error you get should only happen when you don't have enough memory.

from my link:

RAM:

/* Implements the mid-layer processing for osm2pgsql
 * using several arrays in RAM. This is fastest if you
 * have sufficient RAM+Swap.
 *
 * This layer stores data read in from the planet.osm file
 * and is then read by the backend processing code to
 * emit the final geometry-enabled output formats
*/

Postgres:

/* Implements the mid-layer processing for osm2pgsql
 * using several PostgreSQL tables
 *
 * This layer stores data read in from the planet.osm file
 * and is then read by the backend processing code to
 * emit the final geometry-enabled output formats
*/
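
If I read the source right, the Postgres mid-layer quoted above is what the --slim option selects (check osm2pgsql --help on your build to be sure); a minimal sketch, database name assumed:

  # --slim stores node/way data in PostgreSQL tables instead of RAM arrays,
  # so the import needs far less memory (at the cost of speed)
  osm2pgsql --slim -d gis planet-080507.osm.bz2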

Offline

#12 2008-08-03 01:32:10

don martin
Member
Registered: 2008-06-27
Posts: 10

Re: acquiring planet data

Thanks emj - on the primary box this script is consuming ~50% of available RAM (so < 1 GB) with no swapping (I have it running now), so I don't think it's a memory problem.
I don't see any configuration issues with Postgres/PostGIS that would preclude that many records being inserted, but I'll try shutting off vacuum and give it another go.
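
For anyone following along, the vacuum switch-off I mean is just a postgresql.conf change (assuming a PostgreSQL 8.x install with autovacuum enabled):

  # postgresql.conf - turn off autovacuum for the duration of the bulk import,
  # then turn it back on and run VACUUM ANALYZE manually afterwards
  autovacuum = off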

Offline

#13 2008-08-03 10:05:23

Lambertus
Inactive
From: Apeldoorn (NL)
Registered: 2007-03-17
Posts: 3,269
Website

Re: acquiring planet data

Well, the US and Canada data is many times bigger than the NL data, so emj could have a point here. But like I said before, there are lots of people downloading/importing the entire planet file into their PostGIS databases. Please join the dev mailing list and send your question there, as there are usually more developers and hard-core users present.


Mapping tools: Garmin GPSmap 60CSx, Giant Terrago 2002

Offline

#14 2008-08-03 10:13:49

don martin
Member
Registered: 2008-06-27
Posts: 10

Re: acquiring planet data

Will do - thanks Lambertus.

Offline

#15 2008-10-01 23:33:52

Vloris
Member
Registered: 2008-10-01
Posts: 3

Re: acquiring planet data

don martin wrote:

(2) I'm using those older downloads because, using IE (click link - save/cancel dialog), any newer files result in a much smaller file download (184m -> 15m) than is indicated (~4g). Tried right-click save-target with the same result on newer files. Is there something else I can use (FTP)?

You could try using Firefox; some (maybe all) versions of Internet Explorer can't handle downloads larger than 2 GB. Maybe that's your problem?

Offline

#16 2008-10-02 08:56:19

Lambertus
Inactive
From: Apeldoorn (NL)
Registered: 2007-03-17
Posts: 3,269
Website

Re: acquiring planet data

Vloris wrote:
don martin wrote:

(2) I'm using those older downloads because, using IE (click link - save/cancel dialog), any newer files result in a much smaller file download (184m -> 15m) than is indicated (~4g). Tried right-click save-target with the same result on newer files. Is there something else I can use (FTP)?

You could try using Firefox; some (maybe all) versions of Internet Explorer can't handle downloads larger than 2 GB. Maybe that's your problem?

That's a good suggestion. Most browsers' standard download tools aren't the best implementations. Use a proper download tool to download the planet file (see the link in the previous sentence). Most people working with planet files use Linux and hence a good tool such as wget or curl.
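
For the record, the curl equivalent of a resumable download looks roughly like this (URL illustrative; if I remember right an .md5 file is published alongside each dump, so you can verify the result):

  # -C - resumes from the size of the partial file, -O keeps the remote filename
  curl -C - -O http://planet.openstreetmap.org/planet-080507.osm.bz2
  # check the download against the published checksum
  md5sum -c planet-080507.osm.bz2.md5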


Mapping tools: Garmin GPSmap 60CSx, Giant Terrago 2002

Offline

#17 2008-10-02 10:18:41

emj
Member
From: .se (59.3N17.99E) 0735969076
Registered: 2006-06-18
Posts: 949

Re: acquiring planet data

Lambertus wrote:

Most browsers' standard download tools aren't the best implementations. Use a proper download tool to download the planet file.

That's understating it - they should be banned completely... :-) "wget -c" to continue failed transfers is very useful... though I've actually managed to get a corrupted planet.osm even when using "wget -c" or the curl equivalent.

Last edited by emj (2008-10-02 10:19:10)

Offline

#18 2008-10-02 10:53:42

Lambertus
Inactive
From: Apeldoorn (NL)
Registered: 2007-03-17
Posts: 3,269
Website

Re: acquiring planet data

emj wrote:

Though I've actually managed to get a corrupted planet.osm even when using "wget -c" or the curl equivalent.

I had that too, but it was a result of my computer being overclocked, which caused the network interface hardware to flip some bits occasionally.


Mapping tools: Garmin GPSmap 60CSx, Giant Terrago 2002

Offline
