
AWS S3 File Upload With Node.js and React

An extensive guide to uploading files and retrieving the public URL, and to updating and deleting files in an AWS S3 bucket.



Before we start, I'm assuming you have an active AWS account and Node.js installed on your local machine.

Configure an AWS S3 Bucket

Go to the AWS console, search for 'S3', and click Create bucket.

Create Bucket

Allow Public Access

Click on the bucket and go to the Permissions tab. Scroll down to Bucket policy, click Edit, and paste the code below. (Change 'bucket_name' to the name you gave the bucket while creating it.)

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AddPerm",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::bucket_name/*"
        }
    ]
}

Bucket Policy

Generate AWS Credentials

Go to the AWS console, search for 'IAM', click Create user, and follow the on-screen steps.

Create AWS IAM user

Setting Permissions

Click Create group and select AdministratorAccess to grant full access to all AWS services.

Providing full access

Now go to Users, click Create access key, and follow the on-screen process. Save both the access key and the secret key somewhere safe.

AWS IAM Access key and Secret Key

Initialize a Node.js Project

Let's create a Node.js application to build our endpoints.

npm init -y

# Install the necessary libraries
npm i express cors multer dotenv uuid @aws-sdk/client-s3
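Since all the snippets below use ES module `import` syntax, the generated package.json also needs `"type": "module"`, otherwise Node treats .js files as CommonJS. A minimal sketch (the `name` value is just a placeholder):

```json
{
  "name": "s3-upload-demo",
  "type": "module"
}
```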

Create a .env file

# AWS S3
AWS_REGION=ap-south-1
AWS_BUCKET_NAME=your_bucket_name
AWS_BASE_URL=https://your_bucket_name.s3.ap-south-1.amazonaws.com

# AWS IAM Credentials
AWS_ACCESS_KEY=your_access_key
AWS_SECRET_KEY=your_secret_access_key

// index.js

import express from 'express';
import cors from 'cors';

import { storageRouter } from './routes/storage.route.js';

const server = express();

server.use(cors());
server.use(express.json());
server.use(express.urlencoded({ extended: true }));

server.use('/api/storage', storageRouter);

server.listen(8080, () => {
    console.log('Server listening ...');
});

// storage.route.js

import { Router } from "express";
import { pushObject, updateObject, removeObject } from "../controllers/storage.controller.js";
import { multer_parser } from "../configs/multer.config.js";

const storageRouter = Router();

storageRouter.post('/:folder', multer_parser.single('file'), pushObject);
storageRouter.patch('/', multer_parser.single('file'), updateObject);
storageRouter.delete('/', multer_parser.single('file'), removeObject);

export { storageRouter };

// multer.config.js

import multer from "multer";

const storage = multer.memoryStorage();
const multer_parser = multer({ storage });

export { multer_parser };
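multer's memoryStorage keeps each upload in RAM as `file.buffer`, which is what the controller later hands to S3. If you also want to restrict what gets accepted, multer supports a `fileFilter` option; a minimal mimetype allow-list check could look like the sketch below (the list of image types is a hypothetical example, not from the original setup):

```javascript
// Hypothetical allow-list: accept only common image mimetypes.
const ALLOWED = ['image/png', 'image/jpeg', 'image/webp'];

const isAllowed = (mimetype) => ALLOWED.includes(mimetype);

// Wired into multer it would look like:
// multer({ storage, fileFilter: (req, file, cb) => cb(null, isAllowed(file.mimetype)) })

console.log(isAllowed('image/png')); // → true
```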

// s3.config.js

import env from 'dotenv';
import { S3Client } from "@aws-sdk/client-s3";

env.config();

const s3 = new S3Client({
    region: process.env.AWS_REGION,
    credentials: {
        accessKeyId: process.env.AWS_ACCESS_KEY,
        secretAccessKey: process.env.AWS_SECRET_KEY
    }
});

export { s3 };
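One optional hardening step, not part of the original setup: fail fast at startup if any of the variables from the .env file are missing, instead of getting a confusing S3 error later. A minimal sketch (the variable names match the .env file above):

```javascript
// Return the names of required settings that are missing from an env object.
const findMissing = (env, required) => required.filter((name) => !env[name]);

const required = ['AWS_REGION', 'AWS_BUCKET_NAME', 'AWS_ACCESS_KEY', 'AWS_SECRET_KEY'];
const missing = findMissing(process.env, required);

if (missing.length) {
    console.warn(`Missing env vars: ${missing.join(', ')}`);
}
```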

// storage.controller.js

import { s3 } from '../configs/s3.config.js';
import { v4 as uuid } from 'uuid';
import { PutObjectCommand, DeleteObjectCommand } from '@aws-sdk/client-s3';

const pushObject = async (req, res) => {

    const { file } = req;
    const { folder } = req.params;

    if (!file) {
        return res.status(400).send({ message: 'File not found' });
    }

    const extension = file.originalname.split(".").pop();

    const params = {
        Bucket: process.env.AWS_BUCKET_NAME,
        Key: `${folder}/${uuid()}.${extension}`.split("-").join(""), // stripping the dashes from the uuid
        Body: file.buffer,
        ContentType: file.mimetype,
    };

    const putObjectCommand = new PutObjectCommand(params);

    try {
        await s3.send(putObjectCommand);
        const url = `${process.env.AWS_BASE_URL}/${params.Key}`;
        return res.send({ message: 'File Uploaded', url });
    } catch (error) {
        return res.status(500).send({ message: 'Internal Server Error' });
    }
}

const updateObject = async (req, res) => {

    const { file } = req;
    const { url } = req.body;

    if (!file) {
        return res.status(400).send({ message: 'File not found' });
    }

    if (!url) {
        return res.status(400).send({ message: 'Url not found' });
    }

    const { pathname } = new URL(url);

    const params = {
        Bucket: process.env.AWS_BUCKET_NAME,
        Key: pathname.substring(1), // removing the first '/' from the pathname
        Body: file.buffer,
        ContentType: file.mimetype,
    };

    const putObjectCommand = new PutObjectCommand(params);

    try {
        await s3.send(putObjectCommand);
        const url = `${process.env.AWS_BASE_URL}/${params.Key}`;
        return res.send({ message: 'File Updated', url });
    } catch (error) {
        return res.status(500).send({ message: 'Internal Server Error' });
    }
}

const removeObject = async (req, res) => {

    const { url } = req.body;

    if (!url) {
        return res.status(400).send({ message: 'Url not found' });
    }

    const { pathname } = new URL(url);

    const params = {
        Bucket: process.env.AWS_BUCKET_NAME,
        Key: pathname.substring(1), // removing the first '/' from the pathname
    };

    const deleteObjectCommand = new DeleteObjectCommand(params);

    try {
        await s3.send(deleteObjectCommand);
        return res.send({ message: 'File Removed', url });
    } catch (error) {
        return res.status(500).send({ message: 'Internal Server Error' });
    }
}

export { pushObject, updateObject, removeObject };
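The Key scheme used by pushObject — folder prefix, a uuid, the original file's extension, with dashes stripped — can be isolated into a small helper for clarity. A sketch (the uuid is passed in here so the logic is easy to follow; in the controller it comes from the uuid package):

```javascript
// Build an object Key the way pushObject does: folder prefix, a uuid,
// and the original extension, with all dashes stripped.
// Note: .split('-').join('') also removes dashes from the folder name itself.
const buildKey = (folder, originalname, uuid) => {
    const extension = originalname.split('.').pop();
    return `${folder}/${uuid}.${extension}`.split('-').join('');
};

console.log(buildKey('test_folder', 'photo.final.png', 'a1b2c3d4-e5f6-7890-abcd-ef1234567890'));
// → 'test_folder/a1b2c3d4e5f67890abcdef1234567890.png'
```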

Now we have built all the endpoints to upload, update, and delete a file.

// endpoints

POST /api/storage/:folder   // folder name can be anything; the file will be stored inside it

PATCH /api/storage
// body should contain formdata { url: url of the file to update, file: new file }

DELETE /api/storage
// body should contain formdata { url: url of the file to delete }
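Note that PATCH and DELETE take the file's public URL, not the object Key: the controller derives the Key by stripping the leading '/' from the URL's pathname. That mapping is easy to check in isolation:

```javascript
// Derive the S3 object Key from a public bucket URL,
// mirroring the controller's new URL(url).pathname.substring(1).
const keyFromUrl = (url) => new URL(url).pathname.substring(1);

console.log(keyFromUrl('https://my-bucket.s3.ap-south-1.amazonaws.com/test_folder/abc123.png'));
// → 'test_folder/abc123.png'
```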

Connect with a Frontend Application

Let's hit our endpoints from a frontend application built using React, where we can upload and update files.

// App.js

import { Button, Flex, Heading, Image, Input, Stack, VStack } from "@chakra-ui/react";
import { useState } from "react";
import { toast } from "react-hot-toast";

function App() {

    const [isUploading, setIsUploading] = useState(false);
    const [isUpdating, setIsUpdating] = useState(false);
    const [publicUrl, setPublicUrl] = useState('');

    const upload = async (e) => {
        e.preventDefault();

        const formData = new FormData(e.target);

        setIsUploading(true);

        try {
            let response = await fetch('http://localhost:8080/api/storage/test_folder', {
                method: 'POST',
                body: formData
            });

            let { message, url } = await response.json();

            e.target.reset();

            if (response.ok) {
                setPublicUrl(url);
                toast.success(message);
            } else {
                toast.error(message);
            }

        } catch (error) {
            toast.error(error.message);
        }

        setIsUploading(false);
    }

    const update = async (e) => {
        e.preventDefault();

        const formData = new FormData(e.target);

        setIsUpdating(true);

        try {
            let response = await fetch('http://localhost:8080/api/storage', {
                method: 'PATCH',
                body: formData
            });

            let { message } = await response.json();

            if (response.ok) {
                toast.success(message);
            } else {
                toast.error(message);
            }

        } catch (error) {
            toast.error(error.message);
        }

        setIsUpdating(false);
    }

    const copyUrl = () => {
        navigator.clipboard.writeText(publicUrl).then(() => {
            toast.success('Copied');
        }).catch((_) => {
            toast.error('Unable to copy');
        });
    }

    return (
        <Flex p={6} pt={'10vh'} gap={6} h={'100vh'} w={'100%'} direction={['column', 'column', 'row', 'row']}>
            <VStack w={'100%'}>
                {publicUrl ? (
                    <Stack>
                        <Image src={publicUrl} alt="Aws S3 url" h={300} />
                        <Button onClick={copyUrl}> Copy Url </Button>
                        <Button onClick={() => setPublicUrl('')}> Upload More </Button>
                    </Stack>
                ) : (
                    <form onSubmit={upload}>
                        <Heading my={4}> Upload File </Heading>
                        <Input name="file" type="file" required />
                        <Button type="submit" my={4} isLoading={isUploading}> Upload </Button>
                    </form>
                )}
            </VStack>

            <VStack w={'100%'}>
                <form onSubmit={update}>
                    <Heading my={4}> Update File </Heading>
                    <Input name="url" type="url" placeholder="Enter Url to update" required />
                    <Input name="file" type="file" my={4} required />
                    <Button type="submit" isLoading={isUpdating}> Update </Button>
                </form>
            </VStack>
        </Flex>
    );
}

export default App;

React Frontend to test AWS file upload

Conclusion

In this blog, I walked through how to upload, update, and delete a file in an AWS S3 bucket, step by step. We started by creating an S3 bucket and setting its permissions to make its objects publicly accessible.

We also used an AWS service called IAM, where we created a user with full access to AWS services and used its access key and secret key to authenticate programmatically with the S3 bucket for the various operations.

While uploading a file, we use a dynamic route param called 'folder' to control which folder inside the bucket the file is uploaded to.

Multer is used on all the routes to parse the file and form data coming from the frontend.

A React frontend was also built and used to test the endpoints.

Source Code

check here

Thanks for joining me throughout the journey 😊.

* * *