Verify an Emirates ID via a Free #API

On RapidAPI, on this page: https://rapidapi.com/fvetroo5ri/api/verify-emirates-id
there is a free API that returns the name, address and date of birth associated with an Emirates ID (the identity card used across the UAE, including Dubai and Abu Dhabi). Sample code as follows:
curl --request GET \
  --url 'https://verify-emirates-id.p.rapidapi.com/default/EmiratesID?eid=784197600000000' \
  --header 'X-RapidAPI-Host: verify-emirates-id.p.rapidapi.com' \
  --header 'X-RapidAPI-Key: KEY GOES HERE'
and it returns data in the following format:
{
    "FirstName": "Mohammed",
    "LastName": "Fatah",
    "DateOfBirth": "1979-05-05",
    "Emirate": "Dubai",
    "Address": "Sheikh Mohammed Bin Rashed Boulevard Downtown Dubai, PO Box 123234 Dubai, UAE"
}
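Consuming this from code is just a matter of parsing the JSON. Here is a minimal Python sketch over the sample response above; the HTTP call itself is omitted (see the curl example for the required headers):

```python
import json

# Sample response body as documented above; in practice this would be
# the text returned by the HTTP call to the RapidAPI endpoint.
body = '''{
  "FirstName": "Mohammed",
  "LastName": "Fatah",
  "DateOfBirth": "1979-05-05",
  "Emirate": "Dubai",
  "Address": "Sheikh Mohammed Bin Rashed Boulevard Downtown Dubai, PO Box 123234 Dubai, UAE"
}'''

record = json.loads(body)
full_name = f"{record['FirstName']} {record['LastName']}"
print(full_name)               # Mohammed Fatah
print(record["DateOfBirth"])   # 1979-05-05
```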
Very nice; this could be useful for KYC (Know Your Customer) checks.
License plate lookup #API now added support for #Tunisia

We are excited to announce that our License Plate Lookup API has now extended its services to include support for Tunisia. This latest update brings a valuable tool to the Tunisian market, enabling users to retrieve detailed vehicle information from a license plate number.
Technical Details of the Tunisia Support
Our API, accessible at the endpoint https://www.vehiculeapi.tn/api/reg.asmx?op=CheckTunisia, can now process car registration plates from Tunisia and return comprehensive vehicle details. This enhancement is tailored to meet the specific format of Tunisian license plates.
Understanding Tunisian Plates
Tunisian number plates are unique in their format, reading from right to left. For instance, the letters “TU” represent “تونس” (Arabic for Tunisia), and “RS” stands for “ت ن”. A sample registration number like “818TU223” could also be interpreted as:
| 223 | تونس | 818 |
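Based on the right-to-left layout described above, a plate can be split into its three components with a simple regular expression. This is my own illustrative sketch, not part of the API:

```python
import re

def split_tunisian_plate(plate: str):
    """Split a Tunisian registration like '818TU223' into its three
    parts (digits, series letters, digits). How each part should be
    labelled is an assumption based on the right-to-left format
    described above."""
    m = re.fullmatch(r"(\d+)([A-Z]+)(\d+)", plate)
    if not m:
        raise ValueError(f"Unrecognised plate format: {plate}")
    return m.group(1), m.group(2), m.group(3)

print(split_tunisian_plate("818TU223"))  # ('818', 'TU', '223')
```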
Data Returned by the API
Upon querying a Tunisian license plate, the API will return a wealth of information about the vehicle, including:
- Make / Model: Identifies the brand and model of the car.
- Year: The year of registration.
- Fuel Type: Details the type of fuel the vehicle uses.
- Variant: Specifies the variant of the model.
- Engine Type: Provides information about the engine.
- Representative Image: A URL to an image of the vehicle.
Sample JSON Response
For a better understanding, here’s a sample JSON response for the registration number “818TU223”:
{
    "Description": "Kia PICANTO",
    "RegistrationYear": "2017",
    "RegistrationDate": "01-01-2017",
    "CarMake": {
        "CurrentTextValue": "Kia"
    },
    "CarModel": {
        "CurrentTextValue": "PICANTO"
    },
    "MakeDescription": {
        "CurrentTextValue": "Kia"
    },
    "ModelDescription": {
        "CurrentTextValue": "PICANTO"
    },
    "FuelType": "Essence",
    "Variant": "III (JA) ( 03-2017 > )",
    "Engine": "1.0 67ch ( 03-2017 > ---- )",
    "Type": "VOITURE PARTICULIERE",
    "FiscalPower": "0",
    "ImageUrl": "http://www.vehiculeapi.tn/image.aspx/@S2lhIFBJQ0FOVE8="
}
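To use this response in code, you parse the JSON and read the nested CurrentTextValue fields. A minimal Python sketch over an abridged copy of the sample above (no HTTP call shown):

```python
import json

# Abridged sample API response, copied from the example above.
body = '''{
  "Description": "Kia PICANTO",
  "RegistrationYear": "2017",
  "CarMake": {"CurrentTextValue": "Kia"},
  "CarModel": {"CurrentTextValue": "PICANTO"},
  "FuelType": "Essence"
}'''

vehicle = json.loads(body)
make = vehicle["CarMake"]["CurrentTextValue"]
model = vehicle["CarModel"]["CurrentTextValue"]
year = int(vehicle["RegistrationYear"])
print(make, model, year)  # Kia PICANTO 2017
```

Note that some values come back in French: “Essence” is petrol, and “VOITURE PARTICULIERE” means private car.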
This new feature is a significant step forward in our commitment to providing comprehensive and user-friendly vehicle information services worldwide. It opens up a new avenue for users in Tunisia to easily access detailed vehicle information with just a license plate number.
Stay tuned for more updates and enhancements to our License Plate Lookup API!
Using the PayPal REST #API in C#

Some of this code requires custom-built libraries, so it’s not a great reference verbatim, but conceptually you should be able to see how to use the PayPal REST API in C# using this technique.
First, authentication: you need to get a bearer token before you can call any interesting API endpoints:
private static string Authenticate(Environment environment)
{
    var credential = Sandbox;
    if (environment == Environment.LIVE) credential = Live;
    var auth = Convert.ToBase64String(Encoding.UTF8.GetBytes(
        credential.UserName + ":" + credential.Password));
    var http = new HTTPRequest
    {
        HeaderHandler = h => new NameValueCollection
        {
            { "Authorization", "Basic " + auth }
        }
    };
    var apiAuthEndpoint = "https://api-m.sandbox.paypal.com/v1/oauth2/token";
    if (environment == Environment.LIVE)
    {
        apiAuthEndpoint = "https://api-m.paypal.com/v1/oauth2/token";
    }
    const string strGrant = "grant_type=client_credentials";
    var authResponse = http.Request(apiAuthEndpoint, "POST", strGrant);
    var jAuthResponse = JObject.Parse(authResponse);
    var token = jAuthResponse["access_token"] + "";
    return token;
}
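The Basic credential built at the top of Authenticate is just standard HTTP Basic auth over your PayPal client ID and secret. A language-neutral sketch of the same header construction (the credential values here are placeholders, not real keys):

```python
import base64

client_id = "YOUR_CLIENT_ID"   # placeholder, from the PayPal developer dashboard
client_secret = "YOUR_SECRET"  # placeholder

# Equivalent of Convert.ToBase64String(Encoding.UTF8.GetBytes(user + ":" + pass))
auth = base64.b64encode(f"{client_id}:{client_secret}".encode("utf-8")).decode("ascii")

headers = {
    "Authorization": "Basic " + auth,
    "Content-Type": "application/x-www-form-urlencoded",
}
request_body = "grant_type=client_credentials"  # same grant as the C# code
print(headers["Authorization"])
```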
Once you have a token, you can create an order, which in effect gives you a URL to send the user to, and an ID that you should keep for future reference:
var token = Authenticate(environment);
var intPackageAmount = 100;
var packageCurrency = "USD";
var intent = new
{
    intent = "CAPTURE",
    purchase_units = new[]
    {
        new
        {
            amount = new
            {
                currency_code = packageCurrency,
                value = intPackageAmount
            }
        }
    },
    payment_source = new
    {
        paypal = new
        {
            experience_context = new
            {
                payment_method_preference = "UNRESTRICTED",
                landing_page = "LOGIN",
                user_action = "PAY_NOW",
                return_url = domain + "/payments/paypalOrderComplete.aspx",
                cancel_url = domain
            }
        }
    }
};
var json = JsonConvert.SerializeObject(intent, Formatting.Indented);
var apiEndpoint = "https://api-m.sandbox.paypal.com/v2/checkout/orders";
if (environment == Environment.LIVE)
{
    apiEndpoint = "https://api-m.paypal.com/v2/checkout/orders";
}
var http = new HTTPRequest
{
    HeaderHandler = h => new NameValueCollection
    {
        { "Authorization", "Bearer " + token }
    },
    ContentType = "application/json"
};
var orderSetupJson = http.Request(apiEndpoint, "POST", json);
var jOrderSetup = JObject.Parse(orderSetupJson);
var paypalId = jOrderSetup["id"] + "";
var linksArray = jOrderSetup["links"] as JArray;
if (linksArray == null) throw new Exception("Paypal Order setup failed");
var url = "";
foreach (var link in linksArray)
{
    if (link["rel"] + "" != "payer-action") continue;
    url = link["href"] + "";
    break;
}
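The loop at the end picks the payer-action link out of the order-creation response. As an illustration, here is the same extraction in Python against an abridged response; the order ID and URLs below are made up for the example:

```python
# Abridged order-creation response; a real response contains more links.
order = {
    "id": "5O190127TN364715T",  # hypothetical example order id
    "links": [
        {"rel": "self",
         "href": "https://api-m.paypal.com/v2/checkout/orders/5O190127TN364715T"},
        {"rel": "payer-action",
         "href": "https://www.paypal.com/checkout/now?token=5O190127TN364715T"},
    ],
}

# Find the link the user should be redirected to, as the C# loop above does.
url = next((l["href"] for l in order["links"] if l["rel"] == "payer-action"), "")
print(url)  # https://www.paypal.com/checkout/now?token=5O190127TN364715T
```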
The payment_method_preference option set to “UNRESTRICTED” helped later on with the capture part.
Once you’ve sent the user to the URL provided by PayPal, and they have returned to your return URL, PayPal also provides a token in the query string, which matches the ID of the order. (I’ve omitted the part where this ID is held in a database while the user makes the payment.)
Once the user comes back, you need to capture the payment as follows:
// Capture payment
var apiCaptureEndpoint = "https://api-m.sandbox.paypal.com/v2/checkout/orders/{0}/capture";
if (environment == Environment.LIVE)
{
    apiCaptureEndpoint = "https://api-m.paypal.com/v2/checkout/orders/{0}/capture";
}
apiCaptureEndpoint = string.Format(apiCaptureEndpoint, paypalOrderId);
var captureResponse = http.Request(apiCaptureEndpoint, "POST", "");
var jCaptureResponse = JObject.Parse(captureResponse);
status = jCaptureResponse["status"] + "";
The status should be “COMPLETED” once this step finishes, and the money should be transferred.
Test first on sandbox before going live.
Transfer #IIS bindings from one server to another using #Powershell

OK, it’s a common task: you’re migrating from one server to another, but you have one website that responds to hundreds of bindings, and you have to move them. Of course you can copy and paste them one by one, but here’s a script to do it for you. It also works with IDN domains.
First, on the source server, run this:
Import-Module WebAdministration

$siteName = "YOUR_SITE.COM"
$exportPath = "C:\TEMP\bindings.csv"

$bindings = Get-WebBinding -Name $siteName |
    Where-Object { $_.protocol -eq 'http' } |
    Select-Object protocol, bindingInformation

$bindings | Export-Csv -Path $exportPath -NoTypeInformation -Encoding UTF8
Then copy the file bindings.csv to the new server, and import the bindings using this:
Import-Module WebAdministration

$siteName = "YOUR_SITE.com"
$importPath = "C:\temp\bindings.csv"

$bindings = Import-Csv -Path $importPath -Encoding UTF8

foreach ($binding in $bindings) {
    $protocol = $binding.protocol
    $bindingInformation = $binding.bindingInformation -split ':'
    $ipAddress = $bindingInformation[0]
    $port = $bindingInformation[1]
    $hostHeader = $bindingInformation[2]
    New-WebBinding -Name $siteName -Protocol $protocol -IPAddress $ipAddress -Port $port -HostHeader $hostHeader
}
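The import script relies on the IIS bindingInformation format being IP:port:hostHeader. A quick sketch of that split, with example values only:

```python
def parse_binding_information(binding: str):
    """Split an IIS bindingInformation string of the form
    'IP:port:hostHeader', e.g. '*:80:example.com' (example values).
    The host header may itself be empty for catch-all bindings."""
    ip_address, port, host_header = binding.split(":", 2)
    return ip_address, int(port), host_header

print(parse_binding_information("*:80:example.com"))  # ('*', 80, 'example.com')
```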
This doesn’t work for HTTPS bindings, since you also need to copy the certificates (public/private keys) across, which is more complex.
Optimizing #MySQL performance on Windows

The MySQL Community Edition, when installed using default settings, is limited to 128MB of memory, which makes it unobtrusive and stops it hogging resources if misused, which is fine. But sometimes you need a blast of performance to run queries fast, even if MySQL gets greedy with memory.
Everybody knows that memory is faster than disk, so if you find that MySQL is maxing out disk usage while only using 128MB of memory, then you will benefit from giving MySQL access to more of the available memory. Let’s say your desktop machine has 16GB of RAM: you can easily give MySQL 10GB (10G) without affecting system stability. You can see all of this in Task Manager.
So, assuming you’re using Windows, open Notepad (or another text editor) as Administrator and then open the file C:\ProgramData\MySQL\MySQL Server 8.0\my.ini
Find the part that says
innodb_buffer_pool_size=….
and change it to
innodb_buffer_pool_size=10G
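In context, the relevant part of my.ini looks something like this (the [mysqld] section name is standard; the comment is mine):

```ini
[mysqld]
# ... other settings left as installed ...
# Give InnoDB a generous in-memory buffer pool (here 10G on a 16GB machine).
innodb_buffer_pool_size=10G
```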
Then stop and start the MySQL service (Using Services.msc).
As you start running heavy queries, you should see the memory usage of MySQLd go up, and the disk usage go down.
gnutls_handshake() failed: Handshake failed – #GIT #error #Ubuntu #BitBucket

When trying to connect via Git to Bitbucket from an older server, I got this error:
fatal: unable to access xxx : gnutls_handshake() failed: Handshake failed
After updating Git and the CA root certificates, and rebooting the server, nothing seemed to work.
Then I switched to SSH instead of HTTPS. First, get the SSH public key as follows:
cat /home/ubuntu/.ssh/id_rsa.pub
Then log into Bitbucket, press the settings cog in the top right, then “Personal Bitbucket settings”, then “SSH Keys”, and paste in the public key from the result above.
Once added, you can do:
git clone git@bitbucket.org:XXX/XXX
Hope this helps someone!
How to transfer an #S3 bucket from one account to another

Transferring an S3 bucket from one AWS account to another is a pretty common action, and the AWS documentation on this seems quite lacking.
At a high level, you need to give the destination account READ access to the source account’s bucket, and give the source account WRITE access to the destination account’s bucket. In this way the destination does the reading and the source does the writing, which means the whole operation can be performed by AWS S3 internally, without the data flowing through an intermediary service.
The approach below is not exactly “least privilege”, so I’m assuming you trust both the source and destination accounts.
So, here I’m going to go from SOURCE-BUCKET to DESTINATION-BUCKET, where the AWS account ID of the source is 1111111 and the AWS account ID of the destination is 2222222. You will obviously need to replace these placeholders with your own values.
So, under the source bucket, click Permissions, then Edit under Bucket policy, and paste this JSON:
{
    "Version": "2012-10-17",
    "Id": "CrossAccountRead",
    "Statement": [
        {
            "Sid": "AllowCrossAccountRead",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::2222222:root"
            },
            "Action": "s3:*",
            "Resource": "arn:aws:s3:::SOURCE-BUCKET/*"
        }
    ]
}
Then on the destination bucket, do the same in reverse:
{
    "Version": "2012-10-17",
    "Id": "CrossAccountWrite",
    "Statement": [
        {
            "Sid": "AllowCrossAccountWrite",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::1111111:root"
            },
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::DESTINATION-BUCKET/*",
                "arn:aws:s3:::DESTINATION-BUCKET"
            ]
        }
    ]
}
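If you do this sort of migration often, the policy can be generated rather than hand-edited. A small Python sketch that produces a policy in the same shape as the source-bucket example above (it keeps the broad s3:* action, which, as noted, is not least-privilege):

```python
import json

def cross_account_read_policy(bucket: str, reader_account_id: str) -> str:
    """Build a bucket policy in the same shape as the source-bucket
    example above, granting another AWS account access to this
    bucket's objects."""
    policy = {
        "Version": "2012-10-17",
        "Id": "CrossAccountRead",
        "Statement": [{
            "Sid": "AllowCrossAccountRead",
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{reader_account_id}:root"},
            "Action": "s3:*",
            "Resource": f"arn:aws:s3:::{bucket}/*",
        }],
    }
    return json.dumps(policy, indent=2)

print(cross_account_read_policy("SOURCE-BUCKET", "2222222"))
```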
Then, back on the source account again, run the following command in the AWS CLI:
aws s3 sync s3://SOURCE-BUCKET s3://DESTINATION-BUCKET
Change to #Azure #Devops #API, vso.loadtest scope no longer valid

I use Azure DevOps to host some of my code, and to automate deployments I use the Azure DevOps API, which up to now has worked great. Today, I suddenly got the error:
?error=InvalidScope
returned, appended to the callback URL after authorization. I hadn’t changed anything, so I automatically presumed something must have changed on the Azure side.
I logged in, everything seemed fine, I made a minor (non-breaking) change to the app definition and pressed “Save changes”, then I saw the error “Scope is not valid. Cannot mix uri based and modern scopes ‘vso.loadtest’”.
Granted, I didn’t actually need the vso.loadtest scope, but I had over-enabled the scopes just to get the thing working, and you can’t edit the scopes once the app is created.
So I had to create a new app with the appropriate scopes, but everything else identical, copied the new app ID, app secret and client secret to the web.config on my client, and it was still broken.

However, this turned out to be a temporary outage on the Azure side, and it worked again shortly afterwards. I wonder if the two events are related?
Visual Studio debugger not hitting breakpoints

This happened on a Xamarin-based project, connected to a real Android device, in VS 2022: no breakpoints were being hit in the project, even though the application deployed and ran on the device.
Thanks to this post: https://github.com/xamarin/xamarin-android/pull/6660 I enabled “Use Fast Deployment” in the Android project properties, and the breakpoints started to get hit again; it also deployed faster to the device.
Find all webpages hosted on a domain via an API

If you want to find all webpages that are hosted on a given domain, you can use the site: prefix in Google or Bing.
However, let’s imagine you want a more extensive list, and perhaps, you want the result back in JSON format, such that you can use it in your own applications, here is where the WayBackMachine (Internet Archive) can be useful.
So, imagine that you want to see what pages are (or were) hosted on the domain webtropy.com; you’d use the URL
This returns a JSON array of webpages on that domain, divided by year. Of course, you can’t be sure that all of those pages are still live, but you do know that they were live in the year specified, so you can focus on the latest year. There will also be plenty of duplicates, so you’ll need to eliminate those too.
This appears to be an unofficial API, so it is subject to change without warning. However, I hope it is useful!
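The endpoint being referred to is, I believe, the Internet Archive’s CDX server. The parameter names below are taken from its public (but unofficial) documentation, so treat this as a sketch that may need adjusting:

```python
from urllib.parse import urlencode

def cdx_query_url(domain: str) -> str:
    """Build a WayBackMachine CDX query for captured pages on a domain.
    Endpoint and parameters are assumptions based on the Internet
    Archive's publicly documented, but unofficial, CDX API."""
    params = {
        "url": f"{domain}/*",   # every page under the domain
        "output": "json",       # JSON array instead of plain text
        "collapse": "urlkey",   # collapse duplicate captures of the same URL
    }
    return "https://web.archive.org/cdx/search/cdx?" + urlencode(params)

print(cdx_query_url("webtropy.com"))
```

Each row of the JSON result includes a timestamp, which is where the per-year breakdown comes from.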