Archive for August, 2020

Convert French Siret/Siren to VAT (TVA) in C#

This is probably well known to French readers, but you can convert a French company ID (Siret or Siren) to a French VAT (TVA) number with some simple C# code.

The relevant function is as follows:

public static string VatFromSiret(string siret)
{
    // A Siret is 14 digits; the Siren is its first 9 digits
    if (siret.Length > 9) siret = siret.Substring(0, 9);
    var numSiren = Convert.ToInt32(siret);
    // The two-digit VAT key is (12 + 3 * (Siren mod 97)) mod 97
    var validation = (12 + 3 * (numSiren % 97)) % 97;
    return "FR" + validation.ToString("00") + siret;
}
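
For example, with a hypothetical Siret of 12345678900012 (its Siren being the first nine digits, 123456789), the function returns "FR32123456789":

Console.WriteLine(VatFromSiret("12345678900012")); // FR32123456789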

Here is the GitHub repo, for anyone who’s interested:

https://github.com/infiniteloopltd/VatFromSiret/

Categories: Uncategorized

Encryption at rest #MySQL, before and after

If you don’t encrypt data at rest in MySQL, then potentially sensitive information can be easily extracted from the underlying “ibd” files. This scenario may occur if someone has access to the filesystem of your server, but not necessarily access to MySQL itself.

Here, I’ve simply created a new database called “superSecure”, with one table called “Passwords”, which has one column called “Password” and one row containing the text “YELLOW_SUBMARINE”. By running a simple “cat” command on the ibd file, you can clearly see the text “YELLOW_SUBMARINE” in the output.

However, with these commands, we can encrypt the underlying data:

-- Load the keyring plugin and point it at a key file
INSTALL PLUGIN keyring_file SONAME 'keyring_file.so';
SET GLOBAL keyring_file_data = '/var/lib/mysql-keyring/keyring';
-- Encrypt the tablespace holding the Passwords table
ALTER TABLE superSecure.Passwords ENCRYPTION='Y';

Once these commands are complete, we can try to view the ibd file again. This time, there is no plain text to be seen in the file.
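
If you want to double-check from code (rather than by looking at the file) that the table is now encrypted, one option is to read the table’s CREATE_OPTIONS from information_schema, which reports ENCRYPTION="Y" for encrypted tables. Here is a minimal C# sketch; the MySql.Data NuGet package and the local root credentials are assumptions for illustration, not part of the setup above.

using System;
using MySql.Data.MySqlClient;

class CheckEncryption
{
    static void Main()
    {
        // Connection details are placeholders - adjust for your own server
        const string connectionString = "server=localhost;user=root;password=xxxxx;database=superSecure";
        using (var conn = new MySqlConnection(connectionString))
        {
            conn.Open();
            var cmd = new MySqlCommand(
                "SELECT CREATE_OPTIONS FROM information_schema.TABLES " +
                "WHERE TABLE_SCHEMA = 'superSecure' AND TABLE_NAME = 'Passwords'", conn);
            // Prints ENCRYPTION="Y" once the ALTER TABLE above has taken effect
            Console.WriteLine(cmd.ExecuteScalar());
        }
    }
}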

Evidently, this is not foolproof, but it’s one simple way to help secure your data.

Categories: Uncategorized

Microsoft Translation #API using #Azure #CognitiveServices

When it comes to translation APIs, Google Translate is perhaps the first that comes to mind, but there is also the Microsoft Azure Translator API, under the Cognitive Services umbrella.

Here is some simple code to call the API. I’ve omitted my API key, and I’m assuming you are using the West Europe (Ireland) endpoint:

ServicePointManager.SecurityProtocol = SecurityProtocolType.Ssl3 |
                                       SecurityProtocolType.Tls |
                                       SecurityProtocolType.Tls11 |
                                       SecurityProtocolType.Tls12; 

const string strUrl = "https://api.cognitive.microsofttranslator.com/translate?api-version=3.0&to=es&from=en";
var wc = new WebClient();
wc.Headers["Ocp-Apim-Subscription-Key"] = key;
wc.Headers["Ocp-Apim-Subscription-Region"] = "westeurope";
wc.Encoding = Encoding.UTF8;
var jPost = new [] { new { Text = "Hello, what is your name?"}};
var strPost = JsonConvert.SerializeObject(jPost, Formatting.Indented);
wc.Headers[HttpRequestHeader.ContentType] = "application/json";
var json = "";
try
{
    json = wc.UploadString(strUrl, "POST", strPost);
}
catch (WebException exception)
{
    string strResult = "";
    if (exception.Response != null)
    {
        var responseStream = exception.Response.GetResponseStream();
        if (responseStream != null)
        {
            using (var reader = new StreamReader(responseStream))
            {
                strResult = reader.ReadToEnd();
            }
        }
    }
    throw new Exception(strResult);
}

var jResponse = JArray.Parse(json);
var translation = jResponse.First["translations"].First["text"].ToString();
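
For reference, the v3 endpoint returns a JSON array with one element per input text, shaped roughly like this (the exact translated text may differ), which is why the code above reads jResponse.First["translations"].First["text"]:

[
  {
    "translations": [
      { "text": "Hola, ¿cómo te llamas?", "to": "es" }
    ]
  }
]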

It also requires the Newtonsoft.Json NuGet package.

Hope this is useful to someone.

Categories: Uncategorized

Using #AWS #SQS in C# – Decouple database inserts

Amazon SQS or Simple Queue Service is AWS’s answer to Microsoft’s MSMQ, for those who remember it. It’s a great way to decouple an application from database inserts.

So, for example, say your application logs activity to a database, and you want to separate concerns so that in the event of a database outage you don’t lose your data.

Typically, in that situation you can do one of three things: ignore the data and lose it; keep it in memory and hope that the application pool doesn’t recycle or the server doesn’t reboot; or store it in local persistent storage, which quickly becomes complicated to manage.

SQS offers a way to send your data to a queue and have some other process read it back out and store it in the database. If that process fails to write the data, it simply leaves it in the queue until the database is back online (a sketch of such a consumer loop appears at the end of this post).

So, without further ado, here is the code:

First, install the NuGet package for AWS SQS:

Install-Package AWSSDK.SQS

Then create the SQS client as follows (I’ve omitted my IAM access key and secret):

public static AmazonSQSClient CreateClient()
{
    var sqsConfig = new AmazonSQSConfig
    {
        RegionEndpoint = RegionEndpoint.EUWest1
    };
    // The IAM user's access key ID and secret access key (redacted)
    const string iamUser = "xxxxx";
    const string iamPass = "xxxxx";
    var awsCredentials = new BasicAWSCredentials(iamUser, iamPass);
    return new AmazonSQSClient(awsCredentials, sqsConfig);
}

Then, here is some code to write a new message to the queue:

const string strSqsUrl = "https://sqs.eu-west-1.amazonaws.com/005445879168/helloWorld";

var client = CreateClient();

var sendMessageRequest = new SendMessageRequest
{
    QueueUrl = strSqsUrl,
    MessageBody = "Hello World"
};

Console.WriteLine("Sending 'hello world' to queue");
var msg = client.SendMessageAsync(sendMessageRequest).Result;

Console.WriteLine("Sent with messageId: " + msg.MessageId);

Then, in the same function, I’ll read it back (in practice this would be a separate process):

var latestMessages = client.ReceiveMessageAsync(new ReceiveMessageRequest(strSqsUrl)).Result;

Console.WriteLine("Reading messages back from queue:");
foreach (var message in latestMessages.Messages)
{
    Console.WriteLine(message.Body);
}

Now, if you’re happy that everything has worked well, you can delete the message from the queue.

 Console.WriteLine("Deleting from queue");
client.DeleteMessageAsync(new DeleteMessageRequest(strSqsUrl, msg.MessageId));
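
To tie this back to the decoupling idea at the start of the post, the separate consumer process could look roughly like the sketch below. SaveToDatabase is a hypothetical placeholder for your own insert logic, and the long-polling settings are just illustrative; the point is that a message is only deleted once the database write has succeeded.

var client = CreateClient();
var receiveRequest = new ReceiveMessageRequest(strSqsUrl)
{
    MaxNumberOfMessages = 10, // read up to ten messages per call
    WaitTimeSeconds = 20      // long polling, so we don't hammer the queue
};

while (true)
{
    var response = client.ReceiveMessageAsync(receiveRequest).Result;
    foreach (var message in response.Messages)
    {
        try
        {
            SaveToDatabase(message.Body); // hypothetical - your own database insert
            // Delete only after a successful insert; otherwise the message stays
            // in the queue and becomes visible again after the visibility timeout
            client.DeleteMessageAsync(new DeleteMessageRequest(strSqsUrl, message.ReceiptHandle)).Wait();
        }
        catch (Exception)
        {
            // Database is down - leave the message on the queue and retry later
        }
    }
}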

And that’s the basics of how to use SQS.

You shouldn’t use SQS for long-term storage (the default message retention is 4 days, configurable up to a maximum of 14), and you shouldn’t put large (>256 KB) messages in the queue. But it works very well for what it’s designed for.

Categories: Uncategorized