error Archives - ISbyR (Infrequent Smarts by Reshetnikov)
https://isbyr.com/tag/error/

“Create a Custom Skill for Azure AI Search” lab fails
Sat, 21 Dec 2024
https://isbyr.com/create-a-custom-skill-for-azure-ai-search-lab-fails/

I tried to follow the “Create a Custom Skill for Azure AI Search” lab, but it failed with this error: “The request is invalid. Details: The property ‘includeTypelessEntities’ does not exist on type ‘Microsoft.Skills.Text.V3.EntityRecognitionSkill’. Make sure to only use property names that are defined by the type.”

If you try to follow the “Create a Custom Skill for Azure AI Search” lab that is part of the “Implement knowledge mining with Azure AI Search” course, it fails (at least until Microsoft updates the lab files as per my PR).

There are two issues in the update-skillset.json file that is part of this lab.

Issue and Error #1

The request is invalid. Details: The property 'includeTypelessEntities' does not exist on type 'Microsoft.Skills.Text.V3.EntityRecognitionSkill'. Make sure to only use property names that are defined by the type.

To fix it, remove the line that contains includeTypelessEntities from the JSON file mentioned above.

This is due to the deprecation of this parameter by Microsoft.
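For orientation, the offending property sits inside the entity recognition skill definition in update-skillset.json, roughly like this (the surrounding fields are illustrative placeholders, not the exact lab content); the includeTypelessEntities line is the one to delete:

{
  "@odata.type": "#Microsoft.Skills.Text.V3.EntityRecognitionSkill",
  "context": "/document",
  "defaultLanguageCode": "en",
  "includeTypelessEntities": true,
  ...
}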

Issue and Error #2

If you try to run the update-skillset script again, after fixing the first error, you will be greeted by error #2:

“One or more skills are invalid. Details: Error in skill ‘#1’: Outputs are not supported by skill: entities”

To fix it, a few lines below the line you’ve just removed, under the outputs section, replace

"name": "entities"

with

"name": "namedEntities"

That is the output name this version of the skill actually exposes.
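For reference, after both fixes the relevant part of the skill should look roughly like this (the targetName and other fields are illustrative placeholders, not necessarily the exact lab values):

{
  "@odata.type": "#Microsoft.Skills.Text.V3.EntityRecognitionSkill",
  "context": "/document",
  "defaultLanguageCode": "en",
  "outputs": [
    {
      "name": "namedEntities",
      "targetName": "entities"
    }
  ]
}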

That’s it, folks. Enjoy!

Getting ImageAnalysisResultDetails in Azure AI Vision Python SDK
Fri, 05 Jan 2024
https://isbyr.com/getting-imageanalysisresultdetails-in-azure-ai-vision-python-sdk/

Sometimes when using the Azure AI Vision Python SDK you will not get the expected result, meaning that the reason property of the result returned by the analyze method of the ImageAnalyzer class will not be equal to sdk.ImageAnalysisResultReason.ANALYZED.

Phew, that’s a mouthful; it’s easier to show in code:

...
# cv_client (service options), image (vision source) and analysis_options
# are created earlier in the lab code.
image_analyzer = sdk.ImageAnalyzer(cv_client, image, analysis_options)

result = image_analyzer.analyze()

if result.reason == sdk.ImageAnalysisResultReason.ANALYZED:
...

The condition in the last line will not be true.

So you would like to see what it actually was:

print(f'ResultReason = {result.reason}')

That will give us the reason.

Well, that’s not too useful, is it?

Let’s get the actual error behind the reason:

result_details = sdk.ImageAnalysisResultDetails.from_result(result)
print(f'Result Details = {result_details.json_result}')

And voilà: no free soup (“Analyze Operation under Computer Vision API”) for you.
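For completeness, here is a minimal sketch of the whole diagnostic path (assuming the azure-ai-vision preview package imported as sdk, with cv_client, image and analysis_options created earlier as in the rest of the lab code):

import azure.ai.vision as sdk

# cv_client, image and analysis_options are assumed to be created earlier.
image_analyzer = sdk.ImageAnalyzer(cv_client, image, analysis_options)
result = image_analyzer.analyze()

if result.reason == sdk.ImageAnalysisResultReason.ANALYZED:
    # Happy path: use the analysis results here.
    pass
else:
    # The reason alone is not very informative, so print the raw JSON
    # response returned by the service to see the real error.
    print(f'ResultReason = {result.reason}')
    result_details = sdk.ImageAnalysisResultDetails.from_result(result)
    print(f'Result Details = {result_details.json_result}')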

Azure: Invalid user storage id or storage type is not supported
Wed, 27 Dec 2023
https://isbyr.com/azure-invalid-user-storage-id-or-storage-type-is-not-supported/

I was trying to update my Azure Language service to enable Custom text classification / Custom Named Entity Recognition. That feature requires a storage account. While you are supposed to be able to create the storage account when you enable the feature, it didn’t work for me 🙁 (I was getting an “Invalid user storage id or storage type is not supported” error).

Problem 1: “Invalid user storage id or storage type is not supported”

As part of learning a bit about Azure AI Services, I was doing the Classify Text exercise. Since it follows a few prior exercises, I was not creating a new Language service from scratch but rather re-using an existing one, so I needed to enable the Custom text classification / Custom Named Entity Recognition feature on it.

[Screenshot: the Azure Language service Features page]

I would click the Create a new storage account link, fill in all the details, and click Apply on the Azure Language service Features page.

But it would almost immediately error out with an “Invalid user storage id or storage type is not supported” message.


Solution for: “Invalid user storage id or storage type is not supported”

The solution was simple: create a new Azure Storage Account beforehand and then select it from the drop-down list (or use one of your existing ones).
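If you prefer the command line, creating the storage account up front could look something like this (a sketch using the Azure CLI; the name, resource group, region and SKU are placeholders, adjust to your setup):

az storage account create \
  --name <storage-account-name> \
  --resource-group <resource-group> \
  --location <region> \
  --sku Standard_LRS

Once it exists, it shows up in the drop-down list on the Features page.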

Problem 2: blob containers are not visible when creating a new project

Next, when I was trying to create the Custom text single label classification project, I was supposed to select a container (the storage account was already pre-filled), but no containers were visible.

Solution for: blob containers are not visible when creating a new project

Make sure that the managed identity of your Language resource has the necessary permissions on the storage account (portal steps below; a CLI sketch follows them).

  • In the Azure Portal go to your storage account > Access control (IAM).
  • Click Add > Add role assignment.
  • In the Role section, search for and select Storage Blob Data Owner.
  • Under the Members section, select Managed identity and click Select members.
  • In the pane on the right, select the correct subscription, pick Language as the managed identity type, and choose the correct resource.
  • Select, Review + assign, etc. until the role is assigned; you will then be able to pick the blob container in the Azure Language service wizard that is used to create a new project.

Problem 3: “A server error occurred. Please refresh the page and try again”

After going through the wizard, it looks like the project is finally created, but when you click it, the following (very informative 😕) error pops up on the right.

Solution for: “A server error occurred. Please refresh the page and try again”

Add a CORS rule for the Language service endpoint to the Azure Storage account (portal steps below; a CLI sketch follows them).

In the Azure Portal go to your storage account > Resource sharing (CORS).

Fill in:

  • Allowed origins: https://language.cognitive.azure.com
  • Allowed methods: DELETE, GET, PUT
  • Allowed headers: *
  • Max age: 500

Click Save
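The same CORS rule can be added with the Azure CLI; a sketch (the account name is a placeholder, and depending on how you authenticate you may also need to pass credentials such as --account-key):

az storage cors add \
  --account-name <storage-account-name> \
  --services b \
  --methods DELETE GET PUT \
  --origins https://language.cognitive.azure.com \
  --allowed-headers '*' \
  --max-age 500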

You might need to recreate the Custom Text Classification project, either via the UI or using a REST call like the one below:

curl -X PATCH \
  "https://<YOUR_LANGUAGE_SERVICE_URL_PREFIX>.cognitiveservices.azure.com/language/authoring/analyze-text/projects/<PROJECT_NAME>?api-version=2022-05-01" \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json' \
  -H 'Ocp-Apim-Subscription-Key: <YOUR_LANGUAGE_SERVICE_API_KEY>' \
  -d '{
        "projectName": "<PROJECT_NAME>",
        "language": "en-us",
        "projectKind": "CustomSingleLabelClassification",
        "description": "<PROJECT_DESCRIPTION>",
        "multilingual": false,
        "storageInputContainerName": "<BLOB_CONTAINER_NAME>"
      }'

P.S.

I probably could have avoided all these problems if I had just created a new Azure Language service from scratch, following the training, instead of re-using an existing one… but then what would I be writing about here instead? 🤔

AWS VolumeModificationSizeLimitExceeded
Mon, 09 Dec 2019
https://isbyr.com/aws-volumemodificationsizelimitexceeded/

If you are dealing with large amounts of EBS storage and need to extend volumes, you might one day face an AWS VolumeModificationSizeLimitExceeded error.

We occasionally extend our EBS volumes. It is done by updating the CloudFormation template for these instances; a script (written by people much more AWS-knowledgeable than me) then checks the CloudFormation parameters and, if it sees that a volume size has increased, does all the AWS and Linux “black magic” to actually extend the volumes and make the OS aware of it.

One happy day we were extending the volumes of our 60-instance EC2 fleet from 2000 GB to 3000 GB. I updated the CloudFormation template and the extension kicked off. On 51 out of the 60 instances it completed successfully, but on the rest I could see that the cfn-script was complaining: “An error occurred (VolumeModificationSizeLimitExceeded) when calling the ModifyVolume operation: You have exceeded your maximum gp2 volume modification storage limit of 100 TiB in this region. Please contact AWS Support to request an Elastic Block Store volume modification storage limit increase or retry once existing volume modifications have completed”.

Apparently, there is a limit on the total “original” volume size that can be modified in one go; the default limit is 100 TiB, and we had hit it: 52 × 2000 GB ≈ 101.56 TiB.
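Since the error message suggests retrying once existing volume modifications have completed, it helps to see what is still in flight. A sketch with the AWS CLI (assuming describe-volumes-modifications is available in your CLI version):

# List volume modifications that are still in progress in the current region
aws ec2 describe-volumes-modifications \
  --filters Name=modification-state,Values=modifying,optimizing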

It is mentioned at https://docs.aws.amazon.com/general/latest/gr/aws_service_limits.html as “Maximum modifying storage”.


(As of December 2019) I couldn’t find where one can see the current limit and the only way to increase it is by logging a support case with AWS. By the way, you will need to log a “generic” support case and not a “limit increase” case as this limit is not available for selection.

Also, if you plan to extend more than 300 TB in one go, you had better contact AWS as soon as possible; here is what I got from AWS Support when trying to increase the value to 400 TB:

“With regards to your question, the 300TB is the value we can provide at this point because request with higher value requires further review/approval from our EBS team. The outcome depends on their review and we don’t guarantee whether the request will get approved or not.”

P.S. “You’ve reached the maximum modification rate per volume limit. Wait at least 6 hours between modifications per EBS volume.”

This error is different: it happens when you try to increase an EBS volume again less than 6 hours after the previous modification of the same volume.

Update October 2022

These limits are now better documented at: https://docs.aws.amazon.com/general/latest/gr/ebs-service.html and it seems that one can request the limit increase using a dedicated “limit increase” case.

