I'm encountering a very similar issue: the plugin is installed and the IAM role is correctly configured, but when I submit a SessionJob, the Kubernetes Operator logs the following error.
My FlinkDeployment spec contains:

```yaml
flinkConfiguration:
  fs.s3.impl: org.apache.hadoop.fs.s3a.S3AFileSystem
  fs.s3a.aws.credentials.provider: com.amazonaws.auth.WebIdentityTokenCredentialsProvider
```
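For context, this is roughly the full shape of the resource I'm applying; the cluster name and service account below are placeholders for my actual IRSA setup, not values from the error above:

```yaml
apiVersion: flink.apache.org/v1beta1
kind: FlinkDeployment
metadata:
  name: my-session-cluster        # placeholder name
spec:
  flinkVersion: v1_17
  serviceAccount: flink           # placeholder; annotated with eks.amazonaws.com/role-arn for IRSA
  flinkConfiguration:
    fs.s3.impl: org.apache.hadoop.fs.s3a.S3AFileSystem
    fs.s3a.aws.credentials.provider: com.amazonaws.auth.WebIdentityTokenCredentialsProvider
```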
```
Caused by: org.apache.hadoop.fs.s3a.auth.NoAuthWithAWSException: No AWS Credentials provided by DynamicTemporaryAWSCredentialsProvider TemporaryAWSCredentialsProvider SimpleAWSCredentialsProvider Environ
	at org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:216)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.getCredentialsFromContext(AmazonHttpClient.java:1269)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.runBeforeRequestHandlers(AmazonHttpClient.java:845)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:794)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:781)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:755)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:715)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:697)
	at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:561)
	at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:541)
	at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5456)
	at com.amazonaws.services.s3.AmazonS3Client.getBucketRegionViaHeadRequest(AmazonS3Client.java:6432)
	at com.amazonaws.services.s3.AmazonS3Client.fetchRegionFromCache(AmazonS3Client.java:6404)
	at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5441)
	at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5403)
	at com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1372)
	at org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$getObjectMetadata$10(S3AFileSystem.java:2545)
	at org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:414)
	at org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:377)
	at org.apache.hadoop.fs.s3a.S3AFileSystem.getObjectMetadata(S3AFileSystem.java:2533)
	at org.apache.hadoop.fs.s3a.S3AFileSystem.getObjectMetadata(S3AFileSystem.java:2513)
	at org.apache.hadoop.fs.s3a.S3AFileSystem.s3GetFileStatus(S3AFileSystem.java:3776)
	... 29 more
Caused by: com.amazonaws.SdkClientException: Unable to load AWS credentials from environment variables (AWS_ACCESS_KEY_ID (or AWS_ACCESS_KEY) and AWS_SECRET_KEY (or AWS_SECRET_ACCESS_KEY))
	at com.amazonaws.auth.EnvironmentVariableCredentialsProvider.getCredentials(EnvironmentVariableCredentialsProvider.java:49)
	at org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:177)
```
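In case it helps with diagnosis, this is the kind of check I've been using to see whether the web identity token env vars that EKS injects via IRSA actually reach the Flink pods (the pod name below is a placeholder):

```shell
# List the IRSA-related env vars inside the JobManager pod.
# <jobmanager-pod> is a placeholder for the actual pod name.
kubectl exec <jobmanager-pod> -- env | grep -E 'AWS_ROLE_ARN|AWS_WEB_IDENTITY_TOKEN_FILE'
```

If neither variable shows up, WebIdentityTokenCredentialsProvider has nothing to read and the chain falls through to the environment-variable provider, which matches the final `SdkClientException` above.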
Does anyone have suggestions on how to identify and resolve this issue using Flink 1.17?