
Questions tagged with .NET on AWS



Send-SSMCommand 'AWS-ConfigureAWSPackage' -parameters @HashTableOfPackageNameActionInstallType errors that document does not support parameters

I am attempting to use the AWS Tools for PowerShell (AWS.Tools) module version 4.1.42 and PowerShell Core 7.2.5 to send a Run Command to an EC2 instance to install the AmazonCloudWatchAgent package. The error is: `Send-SSMCommand: document AWS-ConfigureAWSPackage does not support parameters`. It MUST support parameters, or else how would the document know what package name/action/installType to install? Does this cmdlet need to be updated?

The document DOES accept parameters, because this works in the AWS CLI:

```
aws ssm send-command --document-name 'AWS-ConfigureAWSPackage' --parameters '{action/packagename/installtype/etc}'
```

So either the document accepts parameters, or AWS Tools for PowerShell passes the `-Parameter` flag to the API differently than the AWS CLI does. The actual code I am running is below. (The parameter string is taken directly from the bottom of the SSM Run Command page in the console, which shows the CLI equivalent after you select your options; I just copied the parameter argument.)

```
$params = '{"action":["Uninstall"],"installationType":["Uninstall and reinstall"],"version":[""],"additionalArguments":[""],"name":["AmazonCloudWatchAgent"]}' | ConvertFrom-Json -AsHashtable
Send-SSMCommand -DocumentName 'AWS-ConfigureAWSPackage' -InstanceId $ID -Parameter $params -Region us-west-2
```

The error is not that the variable is malformed; rather, it is `Send-SSMCommand: document AWS-ConfigureAWSPackage does not support parameters`. I think it is a bug with the cmdlet? Can you try running it? I am guessing that the cmdlet needs to be updated and is not passing the flags correctly to the AWS API for the document; the AWS CLI must be passing them correctly.
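For comparison, here is the same request sketched with boto3 (Python). The `Parameters` value is a plain mapping from string to list-of-strings, which is also the shape the PowerShell hashtable should have after `ConvertFrom-Json -AsHashtable`. The actual API call is commented out since it needs credentials and a real instance ID (the ID shown is hypothetical):

```python
import json

# The console's CLI-equivalent parameter string: every value is a
# list of strings, as the AWS-ConfigureAWSPackage document expects.
raw = (
    '{"action": ["Uninstall"], "installationType": ["Uninstall and reinstall"],'
    ' "version": [""], "additionalArguments": [""],'
    ' "name": ["AmazonCloudWatchAgent"]}'
)
params = json.loads(raw)
assert all(isinstance(v, list) for v in params.values())

# import boto3
# ssm = boto3.client("ssm", region_name="us-west-2")
# ssm.send_command(
#     DocumentName="AWS-ConfigureAWSPackage",
#     InstanceIds=["i-0123456789abcdef0"],  # hypothetical instance ID
#     Parameters=params,
# )
```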
2 answers · 0 votes · 19 views · asked 4 days ago

AWS 2dsphere limitation

Hi all, I am using Amazon DocumentDB (with MongoDB compatibility) on AWS, and our documents include geolocation. We have read the AWS documentation on supported MongoDB APIs [here](https://docs.aws.amazon.com/documentdb/latest/developerguide/mongo-apis.html), and although it says 2dsphere is supported, we receive an error during the creation of the index: "*Command createIndexes failed: Index type not supported: 2dsphere*". The C# code that should generate the index is below:

```
var prefixIndexName = nameof(Account.Address) + "." + nameof(Account.Address.Geolocation);
if (!accountCollection.ExistIndex(prefixIndexName + "_2dsphere"))
{
    Console.WriteLine("Seeding Geolocation Geo2DSphere Index ...");
    var geoLocation = new StringFieldDefinition<Account>(prefixIndexName);
    var indexDefinition = new IndexKeysDefinitionBuilder<Account>().Geo2DSphere(geoLocation);
    var indexModel = new CreateIndexModel<Account>(indexDefinition, new CreateIndexOptions { Background = false });
    accountCollection.CreateIndex(indexModel);
}
```

The field that we are trying to add to the index is "Address", and it looks like this:

```
"Address": {
    "CountryId": number,
    "PostCode": string,
    "AddressLine1": string,
    "AddressLine2": string,
    "City": string,
    "State": string,
    "Geolocation": {
        "type": "Point",
        "coordinates": decimal[] // e.g. [xx.xxxxxxx, xx.xxxxxxx]
    }
}
```

The code works on my local MongoDB installation, so I believe I am missing something to make it run on AWS. Any help you could provide is valuable; thanks in advance for your time!
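Not an answer to the engine-level error, but since DocumentDB's geospatial indexing (where available) appears to be limited to GeoJSON Point data, a quick shape check on the documents can rule out data problems before the index is created. A minimal validator, sketched in Python; the field layout is taken from the question:

```python
def is_geojson_point(value):
    """Check the shape expected for a 2dsphere-indexed field:
    {"type": "Point", "coordinates": [lon, lat]} with numeric coords."""
    return (
        isinstance(value, dict)
        and value.get("type") == "Point"
        and isinstance(value.get("coordinates"), list)
        and len(value["coordinates"]) == 2
        and all(isinstance(c, (int, float)) for c in value["coordinates"])
    )

# Sample document shaped like the "Address" field in the question.
address = {
    "City": "Athens",
    "Geolocation": {"type": "Point", "coordinates": [23.7275, 37.9838]},
}
assert is_geojson_point(address["Geolocation"])
```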
0 answers · 2 votes · 13 views · asked 12 days ago

Creating EC2 Ingress rule in C#

I'm trying to create an ingress rule in C# and I'm getting an error at runtime. Here's the relevant code:

```
// ----- BEGIN Set Vars -----
Amazon.EC2.AmazonEC2Client ec2Client = new Amazon.EC2.AmazonEC2Client();
Amazon.EC2.Model.AuthorizeSecurityGroupIngressRequest secRequest = new Amazon.EC2.Model.AuthorizeSecurityGroupIngressRequest();
Amazon.EC2.Model.IpPermission ipPerm = new Amazon.EC2.Model.IpPermission();
Amazon.EC2.Model.IpRange ipRange = new Amazon.EC2.Model.IpRange();
List<Amazon.EC2.Model.IpPermission> ipRangeList = new List<Amazon.EC2.Model.IpPermission>();
// ----- END Set Vars -----

// ----- BEGIN IP Range -----
ipRange.CidrIp = "5.5.5.10/32";
ipRange.Description = "My new IP rule";
ipRangeList.Add(ipPerm);
// ----- END IP Range -----

// ----- BEGIN IP Perms -----
ipPerm.IpProtocol = "tcp";
ipPerm.ToPort = 3389;
ipPerm.FromPort = 3389;
ipPerm.Ipv4Ranges.AddRange((IEnumerable<Amazon.EC2.Model.IpRange>)ipRangeList);
// ----- END IP Perms -----
```

If I just try to add `ipRange` as a range to `ipPerm`, the compiler complains that it needs to be of type `List<Amazon.EC2.Model.IpPermission>`. When I use the code above and cast it to `List<Amazon.EC2.Model.IpPermission>`, the compiler is happy, but I get a runtime error:

> Message=Unable to cast object of type 'System.Collections.Generic.List`1[Amazon.EC2.Model.IpPermission]' to type 'System.Collections.Generic.IEnumerable`1[Amazon.EC2.Model.IpRange]'.
> Source=System.Private.CoreLib
> StackTrace: at System.Runtime.CompilerServices.CastHelpers.ChkCastAny(Void* toTypeHnd, Object obj) at AWSFirewall.Program.Main(String[] args) in C:\Users\SeanMcCown\source\repos\AWSFirewall\Program.cs:line 44
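The runtime error suggests the list types are swapped: the permission's `Ipv4Ranges` collection must hold IP-range objects, while the request holds the permission itself. For illustration, here is that nesting expressed as a boto3 (Python) request sketch; the group ID is hypothetical and the API call is commented out because it needs credentials:

```python
# The nesting the AuthorizeSecurityGroupIngress API expects:
# request -> IpPermissions (list) -> IpRanges (list) -> {CidrIp, Description}.
ip_permission = {
    "IpProtocol": "tcp",
    "FromPort": 3389,
    "ToPort": 3389,
    "IpRanges": [  # range dicts, not permission dicts
        {"CidrIp": "5.5.5.10/32", "Description": "My new IP rule"}
    ],
}

# import boto3
# ec2 = boto3.client("ec2")
# ec2.authorize_security_group_ingress(
#     GroupId="sg-0123456789abcdef0",  # hypothetical security group ID
#     IpPermissions=[ip_permission],
# )
```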
1 answer · 0 votes · 29 views · asked 13 days ago

RequestParameters for Api Event in Serverless::Function in JSON - how does it work?

I'm trying to add some query string parameters for a Lambda function, using a SAM template written in JSON. All the examples I can find are in YAML; can anyone point out where I'm going wrong? Here's the snippet of the definition:

```
"AreaGet": {
    "Type": "AWS::Serverless::Function",
    "Properties": {
        "Handler": "SpeciesRecordLambda::SpeciesRecordLambda.Functions::AreaGet",
        "Runtime": "dotnet6",
        "CodeUri": "",
        "MemorySize": 256,
        "Timeout": 30,
        "Role": null,
        "Policies": [ "AWSLambdaBasicExecutionRole" ],
        "Events": {
            "AreaGet": {
                "Type": "Api",
                "Properties": {
                    "Path": "/",
                    "Method": "GET",
                    "RequestParameters": [
                        "method.request.querystring.latlonl": { "Required": "true" },
                        "method.request.querystring.latlonr": { "Required": "true" }
                    ]
                }
            }
        }
    }
},
```

and here's the error message I get:

> Failed to create CloudFormation change set: Transform AWS::Serverless-2016-10-31 failed with: Invalid Serverless Application Specification document. Number of errors found: 1. Resource with id [AreaGet] is invalid. Event with id [AreaGet] is invalid. Invalid value for 'RequestParameters' property. Keys must be in the format 'method.request.[querystring|path|header].{value}', e.g 'method.request.header.Authorization'.

Sorry, I know this is a bit of a beginner's question, but I'm a bit lost as to what to do, as I can't find any information about this using JSON. Maybe you can't do it using JSON? Thanks, Andy.
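For what it's worth, in SAM's JSON form each `RequestParameters` entry is typically a single-key object (or a bare string when no options are needed), rather than bare key/value pairs inside the array. A plausible corrected fragment, parsed here with Python's `json` module just to confirm it is valid JSON with the expected key format:

```python
import json

# Hypothetical corrected fragment: each list entry is a one-key object
# whose key follows the 'method.request.[querystring|path|header].{value}'
# pattern named in the error message.
snippet = """
{
  "RequestParameters": [
    { "method.request.querystring.latlonl": { "Required": true } },
    { "method.request.querystring.latlonr": { "Required": true } }
  ]
}
"""
parsed = json.loads(snippet)
for entry in parsed["RequestParameters"]:
    (key, options), = entry.items()  # exactly one key per entry
    assert key.startswith("method.request.")
```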
1 answer · 0 votes · 31 views · asked 15 days ago

AWS S3 Get object not working. Getting corrupt file

I have a .zip file in an S3 bucket. When I try to get it and save it as a file, the file is corrupt and can't be opened. I don't know what else to do; it is proving really hard to get that S3 zip file. I am using this code as a guide: https://docs.aws.amazon.com/AmazonS3/latest/userguide/download-objects.html. I'm currently on Unity, but the SDK for .NET should work. The resulting file has its first 7.4 KB all NULL. This is the code that gets the zip file, returns the buffer, and saves it with `File.WriteAllBytes`:

```
/// <summary>
/// Get Object from S3 Bucket
/// </summary>
public async Task<byte[]> GetZip(string pFile)
{
    try
    {
        Debug.Log("KEY: " + pFile);
        GetObjectRequest request = new GetObjectRequest
        {
            BucketName = S3Bucket,
            Key = pFile
        };
        using (GetObjectResponse response = await S3Client.GetObjectAsync(request))
        using (Stream responseStream = response.ResponseStream)
        {
            Debug.Log("Response stream");
            if (responseStream != null)
            {
                byte[] buffer = new byte[(int)response.ResponseStream.Length];
                int result = await responseStream.ReadAsync(buffer, 0, (int)response.ResponseStream.Length);
                Debug.Log("Stream result: " + result);
                File.WriteAllBytes(songPath, buffer);
                Debug.Log("Read all bytes: " + buffer.Length + " - " + ((int)response.ResponseStream.Length));
                return buffer;
            }
            else
            {
                Debug.Log("Response is null");
            }
        }
    }
    catch (AmazonS3Exception e)
    {
        // If bucket or object does not exist
        Debug.Log("Error encountered ***. Message:" + e.Message + " when reading object");
    }
    catch (Exception e)
    {
        Debug.Log("Unknown error encountered on server. Message:" + e.Message + " when reading object");
    }
    return null;
}
```
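A likely cause, though not confirmed in the thread, is that a single `ReadAsync` call is not guaranteed to fill the buffer; whatever it doesn't fill stays zeroed, which matches a file with leading NULL bytes. The usual fix is to loop until all bytes are read (or use `Stream.CopyToAsync`). A minimal sketch of the looping pattern, in Python for illustration, with a stand-in stream that delivers data in small chunks the way a network stream can:

```python
import io

def read_exactly(stream, size, chunk_size=8192):
    """Read until `size` bytes are consumed or the stream ends.

    Mirrors the fix for the C# code above: a single read() may return
    fewer bytes than requested, so keep reading in a loop.
    """
    buf = bytearray()
    while len(buf) < size:
        chunk = stream.read(min(chunk_size, size - len(buf)))
        if not chunk:  # end of stream
            break
        buf += chunk
    return bytes(buf)

class TrickleStream:
    """Returns at most 10 bytes per read, like a slow network stream."""
    def __init__(self, data):
        self._stream = io.BytesIO(data)
    def read(self, n):
        return self._stream.read(min(n, 10))

data = bytes(range(256)) * 40  # 10240 bytes
result = read_exactly(TrickleStream(data), len(data))
assert result == data  # every byte arrives despite short reads
```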
3 answers · 1 vote · 63 views · asked a month ago

S3 Get object not working properly in Unity

I am using the AWS SDK for .NET in Unity to download zip files from S3. I implemented the get method just as in this tutorial for .NET: https://docs.aws.amazon.com/AmazonS3/latest/userguide/download-objects.html. But when I call the method with `ReadObjectDataAsync().Wait();`, Unity stops and crashes, as if it were in an infinite loop. This is my code; it has a different name but is practically the same:

```
/// <summary>
/// Start is called before the first frame update
/// </summary>
void Start()
{
    customSongsManager = gameObject.GetComponent<CustomSongsManager>();
    GetZip(S3SampleFile).Wait();
}

/// <summary>
/// Get Object from S3 Bucket
/// </summary>
public async Task GetZip(string pFile)
{
    string folder = "Assets/Audio/Custom/";
    try
    {
        GetObjectRequest request = new GetObjectRequest
        {
            BucketName = S3Bucket,
            Key = pFile
        };
        using (GetObjectResponse response = await S3Client.GetObjectAsync(request))
        using (Stream responseStream = response.ResponseStream)
        {
            string title = response.Metadata["x-amz-meta-title"]; // Assume you have "title" as metadata added to the object.
            string contentType = response.Headers["Content-Type"];
            Debug.Log("Object metadata, Title: " + title);
            Debug.Log("Content type: " + contentType);
            if (responseStream != null)
            {
                using (BinaryReader bReader = new BinaryReader(response.ResponseStream))
                {
                    byte[] buffer = bReader.ReadBytes((int)response.ResponseStream.Length);
                    File.WriteAllBytes(folder + S3SampleFile, buffer);
                    Debug.Log("Wrote all bytes");
                    StartCoroutine(customSongsManager.ReadDownloadedSong(folder + S3SampleFile));
                }
            }
        }
    }
    catch (AmazonS3Exception e)
    {
        // If bucket or object does not exist
        Debug.Log("Error encountered ***. Message:" + e.Message + " when reading object");
    }
    catch (Exception e)
    {
        Debug.Log("Unknown error encountered on server. Message:" + e.Message + " when reading object");
    }
}
```

The game crashes at this line: `using (GetObjectResponse response = await S3Client.GetObjectAsync(request))`
2 answers · 0 votes · 30 views · asked 2 months ago

Access S3 files from Unity for mobile development

I'm trying to configure the AWS S3 service to download the files in a bucket using Unity for mobile. I downloaded the SDK package and got it installed. From the AWS console I:

- set up an IAM policy and roles for unauthenticated users,
- created a Cognito identity pool and got its ID,
- set up the S3 bucket and its policy using the generator, including the **arn:aws:iam::{id}:role/{cognito unauth role}** and the resource **arn:aws:s3:::{bucket name}/***.

In code I set the credentials and region and create CognitoAWSCredentials (C#):

```C#
_credentials = new CognitoAWSCredentials(IdentityPoolId, _CognitoIdentityRegion);
```

then I create the client:

```C#
_s3Client = new AmazonS3Client(_credentials, RegionEndpoint.EUCentral1); // the region is the same as in _CognitoIdentityRegion
```

I then try to use the client to get my files (in bucket-name subfolders):

```
private void GetAWSObject(string S3BucketName, string folder, string sampleFileName, IAmazonS3 s3Client)
{
    string message = string.Format("fetching {0} from bucket {1}", sampleFileName, S3BucketName);
    Debug.LogWarning(message);
    s3Client.GetObjectAsync(S3BucketName, folder + "/" + sampleFileName, (responseObj) =>
    {
        var response = responseObj.Response;
        if (response.ResponseStream != null)
        {
            string path = Application.persistentDataPath + "/" + folder + "/" + sampleFileName;
            Debug.LogWarning("\nDownload path AWS: " + path);
            using (var fs = System.IO.File.Create(path))
            {
                byte[] buffer = new byte[81920];
                int count;
                while ((count = response.ResponseStream.Read(buffer, 0, buffer.Length)) != 0)
                    fs.Write(buffer, 0, count);
                fs.Flush();
            }
        }
        else
        {
            Debug.LogWarning("-----> response.ResponseStream is null");
        }
    });
}
```

At this point I cannot step into the async method, I don't get any kind of error, I don't get any file downloaded, and I cannot even check whether the connection to AWS S3 worked in any part of the script. What am I doing wrong? Thanks a lot for the help!
0 answers · 0 votes · 4 views · asked 3 months ago

How can I do Distributed Transaction with EventBridge?

I'm using the following scenario to explain the problem. I have an ecommerce app which allows customers to sign up and get an immediate coupon to use in the application. I want to use **EventBridge** and a few other resources, like a Microsoft SQL database and Lambdas. The coupon is retrieved from a third-party API which exists outside of AWS. The event flow is:

Customer -- *sends web form data* --> EventBridge bus --> Lambda -- *creates customer in SQL DB* -- *gets a coupon from third-party API* -- *sends customer-created-successfully event* --> EventBridge bus

Creating the customer in the SQL DB and getting the coupon from the third-party API should happen in a single transaction. There is a good chance that either can fail, due to a network error or whatever information the customer provides. Even if the customer has provided correct data and a new customer is created in the SQL DB, the third-party API call can still fail. These two operations should take effect only if both succeed.

Does EventBridge provide distributed transactions through its .NET SDK? In the above example, if the third-party call fails, the customer data created in the SQL database should be rolled back, and the message returned to the queue so it can be retried later. I'm looking for something similar to the [TransactionScope](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/servicebus/Azure.Messaging.ServiceBus/samples/Sample06_Transactions.md) that is available in Azure. If that is not available, how can I achieve a distributed transaction with EventBridge, other AWS resources, and third-party services which have a greater chance of failure, as a unit?
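EventBridge itself does not offer distributed transactions, so scenarios like this are commonly handled with compensating transactions (the "saga" pattern): each step pairs an action with an undo, and on failure the completed steps are rolled back in reverse order. A minimal sketch of that pattern in Python; all the step functions are hypothetical stand-ins for "create customer in SQL" and "fetch coupon from the third-party API":

```python
def run_saga(steps):
    """steps: list of (action, compensation) pairs. Runs actions in
    order; on any failure, runs the compensations of the already
    completed steps in reverse order, then re-raises."""
    completed = []
    results = []
    try:
        for action, compensation in steps:
            results.append(action())
            completed.append(compensation)
    except Exception:
        for compensation in reversed(completed):
            compensation()  # undo in reverse order
        raise
    return results

log = []

def create_customer():          # stand-in for the SQL insert
    log.append("customer created")
    return "cust-1"

def delete_customer():          # its compensation
    log.append("customer deleted")

def fetch_coupon():             # stand-in for the third-party call
    raise RuntimeError("coupon API down")  # simulate failure

def release_coupon():           # never runs: the step didn't complete
    log.append("coupon released")

try:
    run_saga([(create_customer, delete_customer), (fetch_coupon, release_coupon)])
except RuntimeError:
    pass

# log is now ["customer created", "customer deleted"]: the insert was
# compensated, and the failed coupon step contributed nothing to undo.
```

In practice the compensation would be a delete against the SQL database, and the retry would come from an SQS dead-letter queue or EventBridge retry policy; AWS Step Functions offers a managed way to orchestrate exactly this kind of rollback sequence.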
3 answers · 0 votes · 29 views · asked 3 months ago

Elastic Beanstalk | .NET with Docker containing custom nginx.conf

Current setup: Elastic Beanstalk running Docker on 64bit Amazon Linux 2/3.4.11. I have been trying to follow the AWS guidelines for overwriting the nginx.conf file located at /etc/nginx/nginx.conf, without any success. I have a .NET 5 project containing .platform/nginx/nginx.conf (I also tried .ebextensions). When I build my Dockerfile, deploy to ECR, and add a dockerrun.aws.json to pull the latest image, it does not pick up my custom nginx.conf. The nginx.conf file:

```
user nginx;
worker_processes auto;
error_log /var/log/nginx/error.log;
pid /var/run/nginx.pid;
worker_rlimit_nofile 8192;

events {
    worker_connections 4096;
}

http {
    include /etc/nginx/mime.types;
    default_type application/octet-stream;
    access_log /var/log/nginx/access.log;

    log_format main '$remote_addr - $remote_user [$time_local] "$request" '
                    '$status $body_bytes_sent "$http_referer" '
                    '"$http_user_agent" "$http_x_forwarded_for"';

    include conf.d/*.conf;

    map $http_upgrade $connection_upgrade {
        default "upgrade";
    }

    server {
        listen 80 default_server;
        gzip on;
        gzip_comp_level 4;
        gzip_types text/plain text/css application/json application/x-javascript text/xml application/xml application/xml+rss text/javascript;

        access_log /var/log/nginx/access.log main;

        location / {
            proxy_pass http://docker;
            proxy_http_version 1.1;
            proxy_set_header Connection $connection_upgrade;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        }

        # Include the Elastic Beanstalk generated locations
        include conf.d/elasticbeanstalk/*.conf;
    }
}
```

I would like to know how I can fix this and replace the default nginx file. Thanks!
1 answer · 0 votes · 83 views · asked 3 months ago