Updating the MdB via API is a simple way to automate this otherwise laborious task. Please involve people experienced in working with web APIs. Each API session needs to be authenticated using AXA (Atlas XML API), and all data sent with this API must be encoded as UTF-8.
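If your source data comes from a system that exports in another encoding (ISO-8859-1 is common for Swedish address registers), convert each value to UTF-8 before sending it. A minimal sketch, where the example value and the fallback encoding are assumptions about your source data, not part of the API:

<?php
// Hypothetical sketch: make sure a field value is UTF-8 before posting it to the API.
// The example value and the ISO-8859-1 fallback are assumptions about the source data.
$value = "Storgatan 1";
if (!mb_check_encoding($value, "UTF-8")) {
    $value = mb_convert_encoding($value, "UTF-8", "ISO-8859-1");
}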
Please note that every API request is sent to a single web server, meaning that if you have child sites, you need to use the URL of each child site. A communication operator with four sites will thus need to run this API against four different URLs.
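As a sketch (the hostnames below are made up placeholders), a multi-site operator simply repeats the same import once per endpoint:

<?php
// Hypothetical example: one API endpoint URL per child site.
// Replace the placeholder hostnames with the actual URL of each of your sites.
$site_urls = [
    "http://site-a.example.local/admin/edit/mdb_buildings/plain/api",
    "http://site-b.example.local/admin/edit/mdb_buildings/plain/api",
    "http://site-c.example.local/admin/edit/mdb_buildings/plain/api",
    "http://site-d.example.local/admin/edit/mdb_buildings/plain/api",
];

foreach ($site_urls as $url) {
    // Run the full import (api_login + insert_posts) against each URL,
    // for example by invoking the script shown below once per endpoint.
    print "Importing MdB data to $url\n";
}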
Here is some example PHP code for using insert_posts():
<?php
// API credentials and endpoint for this site
$api_user = 1234;
$api_key  = "xxx";
$api_hash = "";
$api_url  = "http://demo.stadsnatswebben.local/admin/edit/mdb_buildings/plain/api";

// Small cURL wrapper for talking to the AXA endpoint
class restapi {
    public $status;
    private $url;
    private $curl;

    function __construct($url){
        $this->url = $url;
        $this->curl = curl_init($this->url);
        curl_setopt($this->curl, CURLOPT_POST, true);
        curl_setopt($this->curl, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($this->curl, CURLOPT_COOKIE, "XDEBUG_SESSION=yes");
    }

    // Send a request and return the decoded response for the called method,
    // or false if the response could not be parsed
    function exec($data, $method = "get"){
        if ($method == "get"){
            $url = parse_url($this->url);
            $url = $url["scheme"] . "://" . $url["host"] . $url["path"] . "?" . http_build_query($data);
            curl_setopt($this->curl, CURLOPT_URL, $url);
            curl_setopt($this->curl, CURLOPT_POST, false);
        } else {
            curl_setopt($this->curl, CURLOPT_URL, $this->url);
            curl_setopt($this->curl, CURLOPT_POST, true);
            curl_setopt($this->curl, CURLOPT_POSTFIELDS, http_build_query($data));
        }
        $response = curl_exec($this->curl);
        // The API answers with XML; convert it to an associative array
        $xml = simplexml_load_string($response, "SimpleXMLElement", LIBXML_NOCDATA | LIBXML_NOBLANKS);
        $response = json_decode(json_encode($xml), true);
        $this->status = $response[$data["method"]]["status"] ?? "";
        return $response[$data["method"]]["response"] ?? false;
    }
}

$client = new restapi($api_url);

// Authenticate and fetch the hash key used for subsequent calls
if ($login = $client->exec(["method" => "api_login", "api_user" => $api_user, "api_key" => $api_key])){
    print $login["message"] . "\n";
    $api_hash = $login["hash_key"];
    print "HASH: " . $login["hash_key"] . "\n";
    if ($client->status == "success"){
        // CSV file to import; defaults to addresses.csv if no argument is given
        $file = $argv[1] ?? "addresses.csv";
        if (($handle = fopen($file, "r")) !== false){
            $nr = 0;
            $inserts = [];
            while (($data = fgetcsv($handle, 1000, ";")) !== false){
                $nr++;
                if ($nr == 1){
                    // The first row holds the column names used as field keys
                    $keys = $data;
                } else {
                    $data = array_combine($keys, $data);
                    $inserts["line$nr"] = $data;
                    // Send the posts in batches of 100 rows
                    if ($nr % 100 == 0){
                        $status = insert_posts($inserts);
                        $inserts = [];
                        print "$nr: $status\n";
                    }
                }
            }
            // Send any remaining rows in the last, partial batch
            if ($inserts){
                $status = insert_posts($inserts);
                print "$nr: $status\n";
            }
        }
    }
} else {
    print "Could not login\n";
}

// Post a batch of rows with insert_posts() and summarise the result
function insert_posts($inserts){
    global $client, $api_hash;
    $post = [
        "method" => "insert_posts",
        "hash"   => $api_hash,
        "posts"  => $inserts
    ];
    $inserted = $client->exec($post, "post");
    if (!is_array($inserted)){
        $return = "PARSER ERROR";
    } else {
        $return = $client->status . " " . $inserted["message"];
        // Report every row that was not inserted successfully
        foreach ($inserted["log"] as $nr => $message){
            if ($message != "Inserted successfully"){
                $return .= "\n $nr - $message\n";
            }
        }
    }
    return $return;
}
?>
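The script reads a semicolon-separated CSV file whose first row contains the field names; those names become the keys of each post passed to insert_posts(). The column names and the script filename below are purely illustrative assumptions, not the actual MdB field names:

addresses.csv (hypothetical columns):
street;number;postal_code;city
Storgatan;1;72211;Västerås

Run the import with the CSV file as the first argument:
php mdb_import.php addresses.csv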
Timing
It is our explicit recommendation that you run your import script between midnight and 7 am. The reason for this is that we run an aggregation script for the XSP export at 7 am; if your import has finished by then, that day's data will be included in the export to the service providers the following day. Another reason is that if you were to run the import during the daytime, emptying the database and inserting every row, you would have an empty or incomplete database at a time when end customers could be searching for their address. Also, the resource pressure on the server is much lower during the night.
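For example, a cron entry that starts the import at 02:00, well before the 07:00 aggregation, could look like this (the script name, paths and log file are assumptions for your own environment):

0 2 * * * php /opt/scripts/mdb_import.php /opt/data/addresses.csv >> /var/log/mdb_import.log 2>&1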