Dear reader, if you got something out of my blog and it helped you, it would be great if you supported me by clicking on the Google ads banner at the right. This will cost you nothing but time, but it generates a little bit of revenue for the time I invested ;)
I'm also happy if you LIKE or +1 it, and about comments that help me improve my articles.

Tuesday, December 18, 2012

Mass Data Processing in SAP ABAP Environment - Update Function Modules


For an SAP ABAP project I am currently working on, I faced the challenge of looping through a database table with 50+ million rows, doing some processing on each line, and, after successful processing, writing some data to another table in the database.

Note: throughout this post I use the notation "<some text>" as a placeholder for concrete names and values.

First Attempt: Standard Approach

My first attempt looked roughly like this:

SELECT * FROM <DB table1 (50+ million)> INTO <wa1>.
  " process the data and append the result to internal table <itab1>
ENDSELECT.

LOOP AT <itab1> INTO <wa2>.
  INSERT <DB table2> FROM <wa2>.
ENDLOOP.

Unfortunately, due to the size of the <DB table1> this ran into a timeout error in dialog processing as well as in batch / background processing.

Second Attempt: Write each single line within SELECT ...ENDSELECT

My second attempt looked like this:

SELECT * FROM <DB table1 (50+ million)> INTO <wa1>.
  " process the data and put the processed data into work area <wa2>
  INSERT <DB table2> FROM <wa2>.
  COMMIT WORK.
ENDSELECT.

Unfortunately, the COMMIT WORK within the SELECT ... ENDSELECT causes a short dump, as the COMMIT WORK closes the database cursor which is required for the next SELECT step.

Third Attempt: Using the Update Function Module

After a bit of reading, I figured out that the problems above could be resolved by encapsulating the processing logic in an update function module and putting the COMMIT WORK outside of the SELECT ... ENDSELECT. For the update module I used the option "Start Immediately". Please note as well that no COMMIT WORK may be issued inside the update module.
Additionally, I used the option PACKAGE SIZE n on the SELECT to do packaged (bunched) processing. In pseudocode this looks like this:

SELECT * FROM <large DB table> INTO TABLE <itab> PACKAGE SIZE <package size>.
  CALL FUNCTION 'Update_Module_With_Processing_Logic'
    IN UPDATE TASK
    EXPORTING
      itab_in = <itab>.
ENDSELECT.
COMMIT WORK.

Unfortunately, this approach ran into issues as well: it registered too many update requests, and the update queue crashed before the COMMIT WORK was issued.

Fourth and Final Attempt: Putting it All Together

Finally, after some more reading and a helpful blog I found a working solution. This solution uses

  • Database cursor and FETCH instead of SELECT
  • Packaged processing via PACKAGE SIZE n
  • Decoupling of the DB data fetch from COMMIT by using the option WITH HOLD
  • Decoupling of the work processes via asynchronous RFC calls
  • Queued DB updates via update function module

The corresponding main program logic looks like this:

OPEN CURSOR WITH HOLD <the cursor> FOR SELECT * FROM <large DB table>.
DO.
  FETCH NEXT CURSOR <the cursor> INTO TABLE <itab> PACKAGE SIZE <package size>.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
  " hand the package over to an asynchronous RFC module for processing
  CALL FUNCTION '<RFC module with processing logic>' STARTING NEW TASK <task>
    EXPORTING
      itab_in = <itab>.
ENDDO.
CLOSE CURSOR <the cursor>.

The RFC-module logic looks like this:

CALL FUNCTION 'Update_Module_With_Processing_Logic'
  IN UPDATE TASK
  EXPORTING
    itab_in = <itab>.
COMMIT WORK.


Furthermore I changed the update module type to "Start Delayed".
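The overall pattern behind this solution - fetch in fixed-size packages, hand each package to a decoupled worker, commit once per package - is language-independent. Here is a minimal, hypothetical Python sketch of the idea: the generator stands in for the held cursor with FETCH ... PACKAGE SIZE, and the per-package commit stands in for the update task plus COMMIT WORK (all names are illustrative; none come from the original ABAP code):

```python
from itertools import islice

def fetch_in_packages(rows, package_size):
    """Yield the source rows in fixed-size packages,
    analogous to FETCH NEXT CURSOR ... PACKAGE SIZE."""
    it = iter(rows)
    while True:
        package = list(islice(it, package_size))
        if not package:  # like sy-subrc <> 0: no more data
            return
        yield package

def process_package(package):
    """Stand-in for the processing logic of the RFC module."""
    return [row * 2 for row in package]

def commit_package(results, sink):
    """Stand-in for the update module plus the per-package COMMIT WORK."""
    sink.extend(results)

sink = []
for package in fetch_in_packages(range(10), package_size=3):
    commit_package(process_package(package), sink)

print(len(sink))  # every source row ends up in the target, package by package
```

The key property, mirrored from the ABAP solution, is that no package holds up the others and the "commit" happens once per package, never inside the fetch loop of a single row.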

With this approach I was able to process more than 50 million lines. It is perhaps noteworthy that working off the update queue took about one day (depending on the system). The update queue can be monitored with the help of transaction SM13.

I hope you can get something out of this and that you enjoyed reading. All the best!

Thursday, October 18, 2012

PS3 Repeatedly Cannot Connect to WLAN


I have a PS3 (CECHL04 80GB model from 2008 or so) which we use predominantly for watching movies from the PlayStation Store. Until recently the PS3 was connected via cable to an Apple 1 TB Time Capsule which I used as WLAN router. The Time Capsule itself was connected to a cable modem.
Then, all of a sudden, the Time Capsule quit its service... I wrote about this earlier.

Current Setup

I replaced the Time Capsule with a combination of an Apple Airport Express and a Synology DS212j NAS; the cable modem remained the same. Unfortunately, the Airport Express has only one LAN socket instead of the four on the Time Capsule or an Airport Extreme - which I did not buy for budget reasons. The one socket I reserved for the connection of the DS212j, so the PS3 had to connect to the network via WLAN from now on.

Problems, Problems, Problems - WLAN & PS3 sucks

Everything started without problems - I configured the PS3 network settings (more or less everything on auto mode, the WLAN password, ...) and it worked without problems. ... once ...
The next time I switched on the PS3 (I wanted to watch a movie together with my wife...) nothing worked - no connection to the network was possible.
What followed was a couple of days' nightmare of configuring the network settings again and again and trying all kinds of permutations of the network parameters over and over. The WLAN was always detected, I verified passwords and IP addresses (automatic and manual), and tried the same on the Airport Express side. Nothing worked...

At Long Last - The Helpful Hint

During my configuration exercises I also scanned a thousand forums and blogs without finding a helpful answer right away. In the end one discussion was helpful - unfortunately I cannot find it again to link it here.
In this thread, one author pointed out a peculiarity of Apple WLAN devices: if they are working in an environment with many WLAN routers around (which is the case for me), they switch to WLAN channel 13, which causes problems for many client devices - and especially the PS3. The author advised manually setting the WLAN channel of the Airport to a channel lower than 12 or 10 (can't really remember).

I followed this advice and it has worked like a charm ever since. I tried fixed channels 3 and 6 without problems. I set the PS3 network settings to mostly standard and automatic, no special tricks.

I hope this blog helps other frustrated PS3 network configurators ;)

Monday, October 15, 2012

POSTGIS Spatial Database Installation on a Mac OS X Lion System


Presently I am working on a hobby project of mine where I want to develop a web application with Ruby on Rails which shall visualize data on maps. In essence I want to build a GIS application. After reading a couple of sources on the internet, it became obvious that it would be best to have a spatially enabled database - meaning a relational database with an extension so it can process queries for spatially arranged data. An example would be a simple query where you want to know the gas stations within a certain radius around your current location.
The resulting architecture I want to use is the following:

  • The system database I want to use will be PostgreSQL
  • To spatially enable the database, I want to use the PostGIS addon for PostgreSQL
  • To enable the communication between Rails and the PostGIS server, the activerecord-postgis-adapter is required
  • To be able to write "geospatial" Ruby code, the GeoRuby gem is required
  • Furthermore, I want to do the map visualization with OpenLayers to be independent of commercial map services.
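To make "spatially enabled" a bit more concrete: the gas-station example above boils down to a distance predicate over coordinates. PostGIS evaluates such predicates with dedicated spatial functions and indexes; as a purely illustrative, self-contained Python sketch (with made-up station coordinates), the underlying great-circle distance check looks like this:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean earth radius

# hypothetical gas stations as (name, lat, lon)
stations = [("A", 48.137, 11.575), ("B", 48.200, 11.600), ("C", 49.000, 12.000)]
here = (48.135, 11.582)  # hypothetical current location

within_5km = [name for name, lat, lon in stations
              if haversine_km(here[0], here[1], lat, lon) <= 5.0]
print(within_5km)
```

A spatially enabled database does exactly this kind of filtering inside the SQL engine, so the application never has to pull all rows and compute distances itself.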

Step 1: Installing PostgreSQL

So far, I had used a simpler packaged download for my development (to be honest, my development is currently at chapter 9 of Michael Hartl's excellent tutorial of a micropost app). But I found out that this is insufficient for the installation of the PostGIS enhancement; a full PostgreSQL installation is required (see README.postgis in the PostGIS distribution files).
The latest PostgreSQL EnterpriseDB installation files for the common operating systems are available from the EnterpriseDB download pages.

PostgreSQL Installation Problem

During the installation I experienced a problem which has been seen by other users as well (see for instance this discussion thread): the installation starts without problems, runs all through, but hangs in the end with a message

  • "Loading additional SQL modules"

You can only click on "cancel" (what I did once...) which rolls back the whole installation. The discussion above gave the helpful hint:

  • hard stop the installation via the Activity Monitor
  • stop any processes called "postgres" running in the background
  • restart the installation once again with the same directories

This procedure makes the installation run once again and this time ending without any problems.

Step 2: Installation of PostGIS

I found a good description "installing PostGIS on Mac OS X and Ubuntu" by "juniorz" which I followed. Still, I ran into a couple of issues and obstacles which I will describe here.

First installation step ($ brew install postgis)

Firstly, I encountered the following error:
==> ./configure --prefix=/usr/local/Cellar/proj/4.8.0
==> make install
Error: The linking step did not complete successfully
The formula built, but is not symlinked into /usr/local
You can try again using `brew link proj'
The advice to run "brew link proj" actually did not fix the issue, I got another error message:
$ brew link proj
Linking /usr/local/Cellar/proj/4.8.0...
Error: Could not symlink file: /usr/local/Cellar/proj/4.8.0/bin/proj
Target /usr/local/bin/proj already exists. You may need to delete it.
To force the link and delete this file, do:
  brew link -f formula_name
What helped in the end was forcing the link, which deletes the blocking file:
$ brew link -f proj
A second error appeared for the GEOS library:
==> ./configure --prefix=/usr/local/Cellar/geos/3.3.5
==> make install
Error: The linking step did not complete successfully
The formula built, but is not symlinked into /usr/local
You can try again using `brew link geos'
Here the situation was a bit more difficult than for "proj": neither "$ brew link geos" nor "$ brew link -f geos" worked, as the system refused to delete the existing files due to missing privileges. "sudo brew link -f geos" failed as well.
What helped here in the end was to use the Finder and manually delete the blocking "geos" directories.
Another issue appeared when the system tried to download the required json library: the download process repeatedly stalled after a couple of percent. I was able to fix this by downloading the file json-c-0.9.tar.gz from github manually and moving it to the Homebrew installation directory.
With these little tricks I was able (Hooorray!) to install PostGIS.

Second installation step (initdb /usr/local/var/postgres)

No Issues.

Third installation step (pg_ctl -D /usr/local/var/postgres -l /usr/local/var/postgres/server.log start)

When performing the command above, I ran into the following error:
$ pg_ctl -D /usr/local/var/postgres -l /usr/local/var/postgres/server.log start
server starting
sh: /usr/local/var/postgres/server.log: No such file or directory
This could be fixed simply by adding the required "postgres" directory to /usr/local/var/ with the help of the Finder.

Fourth installation step (createdb postgis_template)

This installation step needed some adjustment as well. Trying to execute the command as noted by juniorz, I got a password error, because the command was executed with my OS user, which did not exist in the DB. After a bit of reading I executed the command with the "-U" option, which allows executing the command while explicitly giving a user name. I used the postgres default user, which was the only one available in my local test DB. With the -U option the command looks like this:
createdb -U postgres postgis_template
You will be asked for the password after hitting enter.

... Here I am stuck in the moment...

Monday, October 8, 2012

Google Chrome Crashes with Error "Chrome.exe - Bad Image" due to icudt.dll


I have been using Google Chrome as my standard browser on a Windows 7 Enterprise PC with Service Pack 1 for a long time. Then, all of a sudden, I got an error popup when I wanted to start Chrome, telling me:

<path to user directory>\AppData\Local\Google\Chrome\Application\22.0.1229.79\icudt.dll is either not designed to run on Windows or it contains an error. Try installing the program again using the original installation media or contact your system administrator or the software vendor for support.
The search for solutions did not yield an immediate remedy, so I want to collect my experiences in a blog post.

1. Attempt: Install Chrome over existing installation

First thing I tried was to simply go to the Chrome website and install it over the existing installation. I did not run the installation in administrator mode. And I did not download the installer explicitly to some directory.
Result: Failed

2. Attempt: Uninstall Chrome completely and Install again

By using the Windows control panel I uninstalled Chrome from my system, went to the Google website and re-installed Chrome again.
Result: This approach worked

Sunday, September 23, 2012

Connecting a DS212j NAS Directly to a Computer Without Using a Router


I had the problem that I wanted to transfer about 100 GB of video files from my old Windows XP PC to my new Synology DS212j NAS (Network Attached Storage). The network environment consisted of the DS as NAS, connected via cable to an Airport Express WLAN router (connected to a cable modem), and the PC connected via WLAN to the network. One complication with this set-up was that the PC is quite far away from the Airport Express, resulting in quite low bandwidth and thus transfer times for the data of several tens of hours.
My goal was to connect the Disk Station directly via cable to the ethernet port of the PC to benefit from the 1 GBit transfer rates of the cable connection.

One note: if you like this post, I'd be happy if you clicked on the ads banner ;)


Before you start to move your NAS to the direct PC "network", you have to note the network settings of the NAS in the standard network with the router. In case of the Synology DS212j do the following:

1. Open the DSM - Disk Station Management Console. This can either be done by using the Synology "Assistant" program (needs to be downloaded from the Synology website) or you can open the DSM by typing <ip address of your DS>:5000.  
2. In DSM start the Control Panel and click in section "System" on "Network":

3. In the network settings go to the tab "Network Interface":

Make sure that your network configuration is set to manual. If not done already, choose an appropriate IP address for your DS together with a subnet mask that fits your WLAN router setup. Note the IP address; you will need it to do the proper configuration of the network settings of your PC as soon as the NAS is connected directly to it. Save this configuration and shut down the NAS.

You have to connect the DS212j - and any NAS having an ethernet port - directly to the PC by using a crossover ethernet cable (this is important), not a standard one. After having realized the hardware set-up (PC + NAS) you have to play a bit with the network settings of the PC.
In the Windows control panel start the network settings tool. Quite likely, the network settings of your PC are set to WLAN. Disable the wireless connection - and don't forget to switch it on again when you are done with your work ;). Now go to the network settings of the ethernet LAN and there configure the TCP/IP settings. As on the NAS, set the network address configuration to "manual". Now you have to choose an IP address for your computer which differs only in the last number from that of the NAS, and choose an appropriate subnet mask (a standard mask worked for me). Save your changes.
You can check whether your changes work by using the ping command. To use it, open a console in Windows (programs --> run --> type in "cmd"), then type in: ping <ip number of NAS> (in my example "ping"). If the PC and the NAS are connected properly, the system will respond to the ping command with something like:
"64 bytes from icmp_seq=35 ttl=64 time=3.856 ms"

Now you can open the NAS drives like any other. However, it will not be available by default in the Windows Explorer. You have to open the drive via "programs" --> "run" and type in the drive/path of the folder you want to open. The only problem I encountered was that the drive could not be opened by its name; I had to use the IP of the drive instead. Meaning I had to type:
\\<ip of NAS>\<directory> (example: "\\\video")

Now it was time to copy my video data, and it worked like a charm.

I hope you got something out of this blog post and enjoyed reading.

Tuesday, September 11, 2012

Model View Control (MVC) Pattern Using C# - Real Life Example

Note: this is a copy of my original article which I published on c-sharpcorner


There are tons of articles around in the web dealing with the implementation of the Model View Controller (MVC) pattern. Nevertheless, I had a hard time finding one usable as a kind of blueprint for a little software project of mine - a Windows Forms application with database access and data table display using the .NET DataGridView. Finally I found Matthew Cochran's article: Introduction to Model View Control (MVC) Pattern using C#
which helped me a lot in getting an MVC up and running.

Still, I always thought that it would have been useful for me to have something closer to real life - and involving a .NET DataGridView. This is what I want to provide in this blog, even if it shows work in progress.

Technical Framework

System: WinXP SP3

.NET version: 4.0

IDE: Visual Studio C# 2010 Express

DB: SQL Server 2008 Express
The UI 
Here is what I wanted: a simple UI which enables the user to edit data stored in a database table. The user should be able to view the current data available in the DB, edit values of existing DB table entries, add lines to the data, delete lines and save the data. In the end I want to have a user interface to maintain a couple of database tables. The individual tables are accessible via a dropdown menu in a main window. Each user request from the main window invokes a form for editing data stored in one DB table:
The window called by the user command looks like this:
The mouse points to what will become the save button.

So far I have implemented only data retrieval from the DB, changing data in the UI, and saving this data back to the DB. The "delete line" and "add line" features are still missing, but I think the existing functionality is enough to show the principles.
The Model
The model is responsible for any DB interaction and data manipulation. Following Matthew's article I created an interface for the model:
public interface IDBTableEditorModel
{
    #region Methods
    void addLineToTable();
    void addObserverView(IDBTableEditorView inTeamTypesView);
    void deleteLineFromTable();
    void notifyObserverView();
    DataTable readDataFromDB();
    void writeDataToDB(DataTable in_DataTable);
    #endregion
}

Note: Don't get irritated by the order of the methods - I just like sorting them alphabetically...
As one might guess from the interface name, I want to use this model in the future to handle DB table data independently of the concrete DB table. This is similar to Matthew's IVehicleModel together with the implementation in class Automobile.

The interface is then implemented in an abstract class which serves as a blueprint for the table dependent implementation and which implements the MVC-relevant logic:
public abstract class classDBTableEditorModel : IDBTableEditorModel
{
    #region Attributes
    IDBTableEditorView theTeamTypeView;
    string viewType;
    #endregion

    public classDBTableEditorModel(string inViewType)
    {
        this.viewType = inViewType;
    }
    public void addObserverView(IDBTableEditorView inTeamTypeView)
    {
        theTeamTypeView = inTeamTypeView;
    }
    public void notifyObserverView() { }
    public void addLineToTable() { }
    public void deleteLineFromTable() { }
    public virtual void writeDataToDB(DataTable inDataTable) { }
    public virtual DataTable readDataFromDB()
    {
        DataTable lt_DataTable = new DataTable();
        // add code for concrete data retrieval here in the concrete class
        return lt_DataTable;
    }
}

Note: those methods that require table-specific logic are declared "virtual".

In this specific example I'm dealing with a DB table called "TeamTypes" which has 3 columns (TeamTypeID (char(3)), TeamTypeDesc (varchar(100)), TeamTypeNumber (bigint)). Thus I have to create a specific table-dependent implementation of my abstract base class, called "classTeamTypesModel"; in this class I have to override the two "virtual" methods:
class classTeamTypesModel : classDBTableEditorModel
{
    public classTeamTypesModel(string teamTypeName) : base(teamTypeName) { }

    public override DataTable readDataFromDB()
    {
        DataTable lt_TeamTypes = new DataTable();
        // read data from the DB via the connector
        try
        {
            lt_TeamTypes = classSqlServerConnector.Instance.TeamTypeReadAll;
        }
        catch { }
        return lt_TeamTypes;
    }
    public override void writeDataToDB(DataTable inDataTable) { }
}

Note: the classSqlServerConnector is not part of the MVC, but hosts all the functionality to read/write data from/to the DB and will not be discussed here.

The Controller

Following Matthew's article I created an interface for the controller as well:
public interface IDBTableEditorController
{
    DataTable onView_Load();
    void onButtonSave_Click(DataTable inDataTable);
    void onButtonDeleteLine_Click();
    void onButtonAddLine_Click();
    void onCellData_Changed();
    void setView(IDBTableEditorView inTeamTypesView);
    void setModel(IDBTableEditorModel inTeamTypesModel);
}

The corresponding implementation of this interface looks like this:
public class classDBTableEditorController : IDBTableEditorController
{
    #region Attributes
    IDBTableEditorModel theTeamTypesModel;
    IDBTableEditorView theTeamTypesView;
    #endregion

    public classDBTableEditorController(IDBTableEditorModel inTeamTypesModel, IDBTableEditorView inTeamTypesView)
    {
        this.theTeamTypesModel = inTeamTypesModel;
        this.theTeamTypesView = inTeamTypesView;
    }
    public classDBTableEditorController() { }

    public void setModel(IDBTableEditorModel inTeamTypesModel)
    {
        this.theTeamTypesModel = inTeamTypesModel;
    }
    public void setView(IDBTableEditorView inTeamTypesView)
    {
        this.theTeamTypesView = inTeamTypesView;
    }
    public DataTable onView_Load()
    {
        return theTeamTypesModel.readDataFromDB();
    }
    public void onButtonSave_Click(DataTable inDataTable)
    {
        if (theTeamTypesModel != null)
        {
            theTeamTypesModel.writeDataToDB(inDataTable);
        }
        else
        {
            throw new Exception("Error in initializing model to controller: TeamTypes");
        }
    }
    public void onButtonAddLine_Click() { }
    public void onButtonDeleteLine_Click() { }
    public void onCellData_Changed() { }
}
The View
For the view, one has to implement the logic in the class code coming with the UI elements. Still, I additionally implemented a view interface:
public interface IDBTableEditorView
{
    void addObserver(IDBTableEditorController theController);
    void updateUserInterface(IDBTableEditorModel inTeamTypeModel);
}
The corresponding view implementation looks like this:
public partial class formTeamTypes : Form, IDBTableEditorView
{
    private IDBTableEditorController theTeamTypesController = new classDBTableEditorController();
    private IDBTableEditorModel theTeamTypeModel = new classTeamTypesModel("Test");

    public formTeamTypes()
    {
        InitializeComponent();
        this.initialize(theTeamTypesController, theTeamTypeModel);
    }
    public void initialize(IDBTableEditorController inTeamTypesController, IDBTableEditorModel inTeamTypesModel)
    {
        if (theTeamTypeModel != null)
            theTeamTypeModel = inTeamTypesModel;
        theTeamTypesController = inTeamTypesController;
    }
    public void addObserver(IDBTableEditorController inTeamTypesController)
    {
        this.theTeamTypesController = inTeamTypesController;
    }
    public void updateUserInterface(IDBTableEditorModel inTeamTypeModel) { }

    private void formTeamTypes_Load(object sender, EventArgs e)
    {
        // when the form is called, the data for team types are read from the DB
        dataGridTeamTypes.DataSource = theTeamTypesController.onView_Load();
        dataGridTeamTypes.Columns[0].HeaderText = "Type ID";
        dataGridTeamTypes.Columns[1].HeaderText = "Type Description";
        dataGridTeamTypes.Columns[2].Visible = false;
        // add a fourth virtual column for a marker flag to mark changed rows
        dataGridTeamTypes.Columns.Add("ChangedFlag", "Changed Flag");
        dataGridTeamTypes.Columns[3].Visible = false;
    }
    private void buttonSave_Click(object sender, EventArgs e)
    {
        DataTable lt_RowsForUpdate = new DataTable();
        DataRow ls_DataRow;
        // build internal table with rows for update
        lt_RowsForUpdate.Columns.Add("TeamTypeID");
        lt_RowsForUpdate.Columns.Add("TeamTypeDesc");
        lt_RowsForUpdate.Columns.Add("TeamTypeNumber");
        lt_RowsForUpdate.Columns.Add("Changed");
        // check for changes and write to DB
        for (int i = 0; i < dataGridTeamTypes.Rows.Count; i++)
        {
            if (dataGridTeamTypes.Rows[i].Cells[3].Value != null &&
                dataGridTeamTypes.Rows[i].Cells[3].Value.ToString() == "C")
            {
                ls_DataRow = lt_RowsForUpdate.NewRow();
                ls_DataRow[0] = dataGridTeamTypes.Rows[i].Cells[0].Value;
                ls_DataRow[1] = dataGridTeamTypes.Rows[i].Cells[1].Value;
                ls_DataRow[2] = dataGridTeamTypes.Rows[i].Cells[2].Value;
                lt_RowsForUpdate.Rows.Add(ls_DataRow);
            }
        }
        theTeamTypesController.onButtonSave_Click(lt_RowsForUpdate);
    }
    private void dataGridTeamTypes_CellValueChanged(object sender, DataGridViewCellEventArgs e)
    {
        // mark rows which have been changed
        if (e.RowIndex >= 0)
            dataGridTeamTypes.Rows[e.RowIndex].Cells[3].Value = "C";
    }
}

Note: marking the changed rows with a "C" might be a bit clumsy, but I didn't find a better way. Maybe you have a better one - I'd be glad to hear about it.
Calling the MVC from the Application
The whole MVC cluster is called by the standard invocation of the form from wherever needed (here from the click event of the menu item defineTeamTypes):

    private void defineTeamTypes_Click(object sender, EventArgs e)
    {
        formTeamTypes theTeamTypesForm = new formTeamTypes();
        theTeamTypesForm.Show();
    }


I hope this real-life example adds a bit to Matthew's article and shows how things can be realized when working with a database.

Wednesday, September 5, 2012

Synology DS212j - Experiences and Usage Scenarios

This is the fourth in a series of blog posts in which I describe my experiences with setting up a Synology disk station DS212j, re-using a hard disk which I recovered from an Apple Time Capsule WLAN router & NAS that quit service. In this post I want to describe my experiences with the new device after having the hardware up and running.

Set-Up and Installation

The setup of the DS is quite simple if you have a SATA hard disk at your disposal which is either empty or does not contain valuable data: the disk will be formatted during the setup process. As described in the first post of this series, the installation procedure is started via a Synology desktop application called "DS Finder", which is available (at least) for Windows and Mac OS X. The Disk Station is detected via WLAN if it has been connected to a WLAN router. If the DS has not been set up before, the DS Finder will list it as "Not Installed".
One thing you have to ensure at this point is that you have an installation file of DSM, the Disk Station operating system available, which can be downloaded from Synology. On the next screen you are asked to tell the program where this installation file is located.
After having started the installation process it is time to go for a loooong coffee break. For my 1TB WD hard disk (from the Time Capsule) the formatting and installation process took about 5-6 hours...

System Configuration

When the coffee break is over, it is time to log in to the system for the first time. This can be done via the DS Finder tool - which opens a browser window - or via a web browser. The Disk Station is available at
<ip address>:5000
At the first login you have to use the default admin account, which has no password by default. The first configuration I did was giving the admin a password...
If you don't want to access the DS through DS Finder (which I didn't want either), it is good to assign the DS a static IP so you can be sure which IP:5000 combination to type into the browser. One can do this by opening in DSM:

System Control --> System --> Network

On the tab "Network Interface", LAN settings you can choose between automatic IP (DHCP) and manual IP assignment. To assign a static address, choose "manual" and enter network config data of your choice.
Work in progress...

Creation of Users

The next thing to do is to think about users being allowed to access the DS. This can be done via the Control Panel which is available on the DSM desktop:

The standard administrator user "admin" has no password assigned by default. One should change this as soon as possible if unauthorized access shall be avoided.
Other users can be created and managed via the "User" admin tool. It is possible to create users with different levels of authorization and detailed access to folders and programs.

Creation of Disk Spaces and Folders

After a successful system installation it is time to create (logical) drives on the disk. This can be achieved with a program called "Storage Manager" which comes with the standard installation of DSM:

Via Storage Manager it is possible to create logical drives:

Note: on the screenshot, the drive has already been created.

After the creation of a logical drive it is time to create folders on the drive.

Accessing the Disk Station Content via iPad (iOS Devices)

To begin with: it is not possible to access music files stored on the DS via iTunes on the iPad, which seems to have restricted capabilities compared with the full-fledged version on a PC. However, it is possible to access the music via iTunes on a Mac or Windows PC (see the section below).
Nevertheless, it is possible to stream media files (music, videos, photos) to an iOS device via special apps Synology provides for free in the App Store. All in all there are 6 apps available for iOS; apps for Android & Windows Phone are available too:

  • DS Audio: streams music to the mobile device in a player similar to iTunes
  • DS Photo+: photo browser for the mobile device
  • DS Video: streams videos stored on the DS to mobile device
  • DS File: gives access to the DS file system
  • DS Download: didn't check

Accessing the Disk Station Content via Play Station

Getting access from my PS3 was a harder task. But this was not so much due to the PS3 as due to the features of the Airport Express which I use as WLAN router and DAC interface to the HiFi amplifier. The details:
In the beginning, my Airport Express WLAN configuration used the "Autochannel" option to determine the WLAN channel. With this option set, it was not possible to get the PS3 connected via WLAN. I checked a couple of community threads and finally found a hint that helped me: if you are running your Airport Express / PS3 setup in an environment with other interfering WLANs, the Airport switches to WLAN channel 13. This seems to cause problems with the PS3. I cured this by manually setting the WLAN channel of the Airport to some channel lower than 13, and the setup has worked ever since.

Back to the Synology experience: after having fixed the network problems, the DS was automatically recognized as a media server by the PlayStation. For each category of the PS3 content (music, video, photos) a new folder icon appears, giving access to the content stored on the DS. I can now stream music, videos and photos directly to the PS3 without the need to store the content there - very satisfying!

Accessing Music on a PC

The integration with a PC (Mac or Windows; Linux I don't know...) is straightforward. You can connect to the DS via a file browser and open files from there. You can open your iTunes library residing on the DS to listen to music. Alternatively, you can run an iTunes Server on the Disk Station (available for download in the DS control panel). This iTunes Server makes the music stored in a directory called "Music" available as a shared iTunes library. Note that this "iTunes Server" does not work as a source to stream music to e.g. an iPad; you need a fully fledged iTunes on your remote machine to access this iTunes Server on the DS.


In the first blog post after the crash of my Time Capsule I formulated a couple of things I wanted to achieve with a replacement for the device:

  • Separate the disk from the WLAN router to avoid collateral damage if one of the devices breaks
  • Realize a setup that allows streaming music directly from the disk without needing a computer running
  • Connection to our HiFi equipment
  • A reasonably moderate price
  • With the TC I did the data backup manually; it would be nice to have this automated in a RAID array
  • From what I gathered reading a couple of forums, it would be nice if the new device had an iTunes server on board
  • I also wanted to connect a PS3 to the disk to view photos via the PS3
Summing up, I can only say that all of these goals have been achieved. I cannot yet judge the RAID functionality, as I have not installed a second hard disk, but since RAID is one of the main features of the DS212j, I assume it will work without major problems.

Update from April 2013: I installed a second disk about half a year ago. My initial assumption was right: installation and setup of the second disk worked without problems. I created a RAID array with the two disks, and it has been working ever since. So far I fortunately haven't had the chance to verify that the RAID redundancy works, as neither of the disks has crashed.
All in all I am VERY content with the new setup. Streaming to the iPad works perfectly, and the output to the HiFi amplifier via the Airport Express works seamlessly too. My only complaint about the Airport is its limited number of Ethernet ports (only one). The cooperation between the DS and the PS3 also works very well, even if there was an Airport-induced challenge to master.

I hope this article helps you reach your own decision, and that you enjoyed reading it.

Sunday, September 2, 2012

Nigeria Connection Reloaded - A New Twist on Advance-Fee Fraud

Today I received an email with "amusing" content of a kind I had hoped belonged to the past. It appears to be a case of "advance-fee fraud" (see e.g. Wikipedia). Since I had not come across this particular variant before, here is a report as a warning to everyone else who is targeted by fraudulent emails.

Step 1: Contact via Immobilienscout

The (presumed) fraudsters became aware of me through a listing on Immobilienscout24 in which my email address does not appear. Just a few hours after the listing was published, I received a normal-looking email signalling interest in the property I had advertised and asking for a viewing appointment.

Step 2: Replying to a Legitimate-Looking Inquiry

Pleased, I replied to the email directly from my mailbox - Immobilienscout offers the option of clicking a link right in the Immobilienscout email, which opens a reply email from me. I offered the sender a viewing appointment within the next few days and waited for a response...

Step 3: The Reply Was Not What I Expected...

Instead of a confirmation of my proposed appointment, I received the following email:

Dear Sir / Madam,
My name is <name>, a US captain with the African Union / United Nations Hybrid Operation in Darfur. Based on the recent US govt decision on a troop withdrawal, I have planned to quit the service environment and relocate to your country. Let me assure you that I am going to buy your property, but first I would like to make you an offer. I have in my possession the sum of $19.8 million (US dollars), which was recovered from one of our raids on some rebel sponsors and an underworld drug cartel here, because they keep most of their money at home for evil activities such as illegal deals in crude oil, weapons and drugs.
Based on the suffering we go through here, some of us come across such luck. It happened that I went on this raid with the men in my unit, and I decided to take it as my share for my stress here in this evil country filled with suicide bombers and snipers. I deposited the money with a Red Cross agent. It is within my power to approve who comes forward for this money.
Where I have a problem is transferring this money to someone I can trust. I intend to invest this money in stocks and real estate. I cannot move this money to the United States because I will be relocating to your country, as I want to invest it there, so I need someone I can trust to transfer the money to for safekeeping. If you accept, I will transfer the money to Europe, where you will be the beneficiary, because I am a uniformed person and I cannot parade such an amount, just in case, so I need someone to present as the recipient. As an intelligent American officer, I have a 100% authentic means of transferring the money through diplomatic channels. I only need your acceptance and everything is done.
Please, if you are interested in this transaction, I will send you the complete details you need for us to carry out this transaction successfully. I believe I can trust you. Where we are now, we can only communicate through our military communication facilities, which are secured so that nobody can monitor our conversations; then I can explain everything in detail. I only want to reach you by email, because our calls could be monitored; I just have to be sure who I am dealing with.
I am doing this on trust; you should understand and know that as a trained military expert I will always play it safe in case you are the bad kind, but I pray you are not. $19.8 million is a lot of money and anyone's dream.
I await your quick reply so that we can proceed. In less than a week the money should be in your care, and I will come for my money. You will be entitled to 35% of the money once you complete this deal, and 65% is for me. I hope I am being fair on this deal.
Kind regards,
Capt <name>.
If I were to take up this "tempting" offer, sooner or later I would most likely be asked to provide my bank details and to transfer "fees" to the "person seeking help" so that the "bank transaction" could be initiated. It goes without saying that my money would be gone for good and the millions would never arrive in my account. The scheme has also been nicely described elsewhere.


As you can see, the scheme known as the "Nigeria Connection" is active again in force. I advise everyone against replying to such an email. Anyone who has done so anyway is welcome to tell me what happened next - I am curious...

Thursday, August 23, 2012

Death of a Time Capsule Part 3 - Rescuing the Data from the TC Disk

This is the third post of a series about my experiences with replacing an Apple Time Capsule with a Synology Disk Station DS212j. The previous post covered the first steps of setting up the DS212j and ended with the discovery that all data is deleted from the hard disk(s) installed in the DS - catastrophic for me, as the disk held valuable data that was not yet backed up.

What Options Were Available to Save the Data on the former Time Capsule Disk?

First of all a summary of the hardware available for the operation:

  • The 1 TB Western Digital hard disk I recovered from the Time Capsule (see my first post of this series).
  • 1 old Dell PC running under Windows XP. I used this PC to check if the Time Capsule Disk was still alive.
  • 1 USB 1 TB disk which I used for backups of the data on the Time Capsule. The disk was formatted in Mac OS Extended format.
From my experience with the health check of the Time Capsule SATA disk I knew it could be connected to the Dell PC. Fortunately, the PC had two physical hard disks inside, which I had partitioned so that the system and the programs reside on two partitions on the first disk, while the second disk hosted one larger volume for data. Thanks to this partitioning, the Windows system was able to boot even with the "DATA" disk replaced by the old Time Capsule disk.

Reading Mac Formatted Hard Disks under Windows XP

The first step was done: the TC disk was physically running in my old Dell PC. Unfortunately, this disk was formatted in a Mac format (I didn't check exactly which one), which cannot easily be read under Windows.
Some research on the web led me to the Windows utility "MacDrive". It can be downloaded as a 30-day evaluation version, which is free of charge. I downloaded the program, installed it, and after a reboot the Mac-formatted disk was available in the Windows Explorer :) I also connected the USB disk (Mac-formatted as well), which then appeared in the Explorer too.
However, copying the data from the TC disk to the USB drive did not go so smoothly. It was not possible to simply copy the root folder of the TC disk to the USB drive, because Windows choked on some path names that were too long. I also didn't manage to create a compressed archive of the TC data, as I couldn't find a compression program able to handle roughly 350 GB of data.
In the end I simply copied the folders that I knew contained data added since the last backup. ...Not very convenient, but the only option I had.
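In hindsight, this selective copying could also have been scripted. Below is a small, purely illustrative Python sketch (not something I used at the time; the function name and the 255-character limit are my assumptions about the classic Windows constraint) that copies a directory tree while skipping files whose destination path would be too long, collecting the skipped paths for manual follow-up:

```python
import os
import shutil

# Assumed path-length limit of classic Windows APIs (illustrative value)
MAX_PATH = 255

def copy_tree_skipping_long_paths(src, dst):
    """Copy the directory tree at src to dst, skipping any file whose
    destination path would exceed MAX_PATH. Returns the skipped paths."""
    skipped = []
    for root, dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target_dir = dst if rel == "." else os.path.join(dst, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            target = os.path.join(target_dir, name)
            if len(os.path.abspath(target)) > MAX_PATH:
                # Remember the file instead of letting the copy fail
                skipped.append(target)
                continue
            shutil.copy2(os.path.join(root, name), target)
    return skipped
```

The list returned at the end would have told me exactly which files still needed manual treatment (e.g. renaming or moving them higher up the tree) instead of the copy aborting halfway through.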

In the End - Happy End

In the end, I managed to rescue all the data important to me onto the USB disk. What saved me was an old Windows PC and the program MacDrive. As I used it for only two days, this very important tool didn't even cost me anything. Finally, the former Time Capsule disk was ready to be re-used in the newly purchased empty Synology DS212j case.

I will write another post about my experiences with the DiskStation up and running.