If you are dealing with large numbers of EBS volumes and need to extend them, you might one day face an AWS VolumeModificationSizeLimitExceeded error.
We occasionally extend our EBS volumes. This is done by updating the CloudFormation template for these instances; then a script (written by people much more AWS-knowledgeable than myself) checks the CloudFormation parameters, and if it sees that the volume size has increased, it does all the AWS and Linux “black magic” to actually extend the volumes and make the OS aware of the change.
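For context, the per-instance part of that “black magic” is roughly the standard EBS resize dance. This is a minimal sketch, not our actual script; the volume ID, device name, and ext4 filesystem are assumptions:

```shell
# Hypothetical volume ID and new size; adjust for your setup
VOLUME_ID="vol-0123456789abcdef0"
NEW_SIZE_GB=3000

# 1. Ask AWS to grow the volume (this is the call that can fail with
#    VolumeModificationSizeLimitExceeded)
aws ec2 modify-volume --volume-id "$VOLUME_ID" --size "$NEW_SIZE_GB"

# 2. Check the modification state; the new size is usable once the state
#    reaches "optimizing" or "completed"
aws ec2 describe-volumes-modifications --volume-ids "$VOLUME_ID" \
    --query 'VolumesModifications[0].ModificationState'

# 3. On the instance: grow the partition, then the filesystem
#    (growpart is from cloud-utils; resize2fs assumes ext4)
sudo growpart /dev/nvme1n1 1
sudo resize2fs /dev/nvme1n1p1
```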
One happy day we were extending the volumes of our 60-instance EC2 fleet from 2000GB to 3000GB. So I updated the CloudFormation and the extension kicked off. On 51 out of 60 instances it completed successfully, but on the rest the cfn script was complaining: “An error occurred (VolumeModificationSizeLimitExceeded) when calling the ModifyVolume operation: You have exceeded your maximum gp2 volume modification storage limit of 100 TiB in this region. Please contact AWS Support to request an Elastic Block Store volume modification storage limit increase or retry once existing volume modifications have completed”
Apparently there is a limit on the total “original” volume size that can be modified in one go, and the default limit of 100 TiB is exactly what we hit: 52 × 2000GB = 104000GB ≈ 101.56 TiB.
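The arithmetic also explains why exactly 51 instances went through. A small sketch (the 100 TiB limit and 2000 GiB volume size come from the error above; the calculation itself is mine):

```python
# Default per-region limit on the total size of gp2 volumes that can be
# modified concurrently (as of December 2019): 100 TiB
LIMIT_TIB = 100
GIB_PER_TIB = 1024

volume_gib = 2000  # original size of each volume being extended

# How many 2000 GiB volumes fit under the limit at once:
max_concurrent = (LIMIT_TIB * GIB_PER_TIB) // volume_gib
print(max_concurrent)  # 51 -> exactly the number of instances that succeeded

# The 52nd request pushes the running total over the limit:
total_tib = 52 * volume_gib / GIB_PER_TIB
print(total_tib)  # 101.5625 (TiB), which exceeds 100
```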
It is mentioned in https://docs.aws.amazon.com/general/latest/gr/aws_service_limits.html as “Maximum modifying storage”.
(As of December 2019) I couldn’t find anywhere to view the current limit, and the only way to increase it is by logging a support case with AWS. Note that you will need to log a “generic” support case rather than a “limit increase” case, as this limit is not available for selection.
Also, if you plan to extend more than 300TB in one go, you had better contact AWS as soon as possible. Here is what I got from AWS Support when trying to increase the limit to 400TB:
“With regards to your question, the 300TB is the value we can provide at this point because request with higher value requires further review/approval from our EBS team. The outcome depends on their review and we don’t guarantee whether the request will get approved or not.”