Remove n lines from JSON when uploaded with lambda

I have a Lambda function set up with API Gateway. When I upload a JSON file, I get these 3 lines at the top when I check the file in S3. How can I remove them?

I tried different variations with split(), but I keep removing the wrong lines.

Code:

async function process(requestBody) {
    // requestBody arrives as the raw multipart body, boundary lines included
    const fileName = requestBody;
    let fileContent = requestBody;

    const params = {
        Bucket: bucketName,
        Key: `${fileName}`,
        Body: fileContent
    };

    await s3.putObject(params).promise();
    return util.buildReponse(200);
}
------WebKitFormBoundary5xOkojIBpXqoCtdp
Content-Disposition: form-data; name="demo file"; filename="simple.json"
Content-Type: application/json

[
    {
      "id": "5ac6be6b-8064-4159-9de6-89178a9f8a54",
      "firstname": "Matt",
      "lastname":"Hansen"
    },
    {
      "id": "4b912da3-66aa-45c1-a87f-9f79a256e570",
      "firstname": "Brad",
      "lastname":"Lunsford"
    }
]

Hi @kylegilmartin10

I can’t tell you whether removing this data is a good idea or not; I suspect you would need to set the correct URI request parameters here. Someone more informed would need to comment on that.

What I can tell you is that what you are trying to achieve with a string output and split() is possible. One way is to use a regex with a capturing group,
e.g.
/(\[)/ <- the parentheses capture the escaped opening bracket

console.log(
  'unwanted stuff here [{"someArray": ["string1", "string2"]}]'.split(/(\[)/)
)
// Array output
// ['unwanted stuff here ', '[', '{"someArray": ', '[', '"string1", "string2"]}]']

Note: unlike with a standard string separator, e.g. split('['), the bracket isn’t removed from the result.
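
For comparison, the same string split with a plain '[' separator loses the brackets:

console.log(
  'unwanted stuff here [{"someArray": ["string1", "string2"]}]'.split('[')
)
// Array output
// ['unwanted stuff here ', '{"someArray": ', '"string1", "string2"]}]']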

You can then use .slice(1) on the array from the regex split to copy everything except the first element (i.e. drop 'unwanted stuff here ') and join it back together.

console.log(
  'unwanted stuff here [{"someArray": ["string1", "string2"]}]'
  .split(/(\[)/)
  .slice(1)
  .join('')
)
// String output
// [{"someArray": ["string1", "string2"]}]
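
Applied to the multipart output you pasted, and assuming requestBody really is that whole string, the same approach would drop everything before the first opening bracket. This is only a sketch; a real multipart body usually also has a closing boundary after the JSON, which this does not remove:

const jsonPart = requestBody
  .split(/(\[)/) // keep every '[' in the result
  .slice(1)      // drop the boundary and header lines before the first '['
  .join('');
// jsonPart now starts at the '[' of the JSON array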

I am not convinced the above is the solution though.


One thing I am seeing in various posts on uploading JSON to AWS, which appears to be missing from your script, is the content type, e.g.

const params = {
    Bucket:bucketName,
    Key: `${fileName}`,
    Body:fileContent,
    ContentType: 'application/json' // <--
}
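
For reference, here is how that parameter might slot into the handler from your first post. This is only a sketch and assumes s3, bucketName and util are already defined as in your original code:

async function process(requestBody) {
    const fileName = requestBody;
    let fileContent = requestBody;

    const params = {
        Bucket: bucketName,
        Key: `${fileName}`,
        Body: fileContent,
        ContentType: 'application/json' // tell S3 (and anything reading the object) this is JSON
    };

    await s3.putObject(params).promise();
    return util.buildReponse(200);
}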

Not sure if this is useful or not

Thanks, that fixed it. The reason I needed the top 3 lines removed is that I have another Lambda function with a trigger on that S3 bucket, which grabs the JSON and splits it so it can go into DynamoDB as separate entries.
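
For anyone finding this later, a minimal sketch of that kind of trigger function (the table name People and the id partition key are just placeholders, and it assumes the same AWS SDK v2 style used above):

const AWS = require('aws-sdk');
const s3 = new AWS.S3();
const docClient = new AWS.DynamoDB.DocumentClient();

exports.handler = async (event) => {
    // The S3 trigger passes the bucket and key of the object that was just written
    const bucket = event.Records[0].s3.bucket.name;
    const key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));

    // Read the uploaded file and parse it as a JSON array
    const object = await s3.getObject({ Bucket: bucket, Key: key }).promise();
    const entries = JSON.parse(object.Body.toString('utf-8'));

    // Write each array element to DynamoDB as its own item
    for (const entry of entries) {
        await docClient.put({ TableName: 'People', Item: entry }).promise();
    }
};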

