I am trying to create Firehose streams that can receive data from different regions in Account A, through AWS Lambda, and output into a Redshift table in Account B. To do this I created an IAM role on Account A:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "",
      "Effect": "Allow",
      "Principal": {
        "Service": "firehose.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}

I gave that role the following permissions:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "",
      "Effect": "Allow",
      "Action": [
        "s3:AbortMultipartUpload",
        "s3:GetBucketLocation",
        "s3:GetObject",
        "s3:ListBucket",
        "s3:ListBucketMultipartUploads",
        "s3:PutObject"
      ],
      "Resource": [
        "arn:aws:s3:::b-bucket/*",
        "arn:aws:s3:::b-bucket"
      ]
    },
    {
      "Effect": "Allow",
      "Action": ["firehose:*"],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": ["redshift:*"],
      "Resource": "*"
    }
  ]
}
On Account B I created a role with this trust policy:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "",
      "Effect": "Allow",
      "Principal": {
        "Service": "firehose.amazonaws.com"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "sts:ExternalId": "11111111111"
        }
      }
    }
  ]
}
I gave that role the following access:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:*"],
      "Resource": [
        "arn:aws:s3:::b-bucket",
        "arn:aws:s3:::b-bucket/*",
        "arn:aws:s3:::b-account-logs",
        "arn:aws:s3:::b-account-logs/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": ["firehose:*"],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": "redshift:*",
      "Resource": "arn:aws:redshift:us-east-1:cluster:account-b-cluster*"
    }
  ]
}
I also edited the access policy on the S3 buckets to give access to my Account A role:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::11111111111:role/AccountAXAccountBPolicy"
      },
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::b-bucket",
        "arn:aws:s3:::b-bucket/*"
      ]
    }
  ]
}
However, none of this works. When I try to create the stream in Account A, it does not list the buckets in Account B, nor the Redshift cluster. Is there any way to make this work?
Accepted answer:

John's answer is semi-correct. I would recommend that the account owner of the Redshift cluster creates the Firehose stream. Creating through the CLI requires you to supply the username and password. Having the cluster owner create the stream and share IAM role permissions on the stream is safer for security and in case of credential changes. Additionally, you cannot create a stream that accesses a database outside of its region, so have the delivery application access the correct stream and region.
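The "share IAM role permissions on the stream" approach can be sketched as a small producer policy scoped to the stream ARN, so senders never need the cluster credentials. This is a sketch, not code from the original post; the stream ARN below is a hypothetical example.

```python
import json

# Hypothetical stream ARN -- substitute the real account ID and region.
STREAM_ARN = ("arn:aws:firehose:us-east-1:111111111111:"
              "deliverystream/testFirehoseStreamToRedshift")

# Policy the stream owner attaches to a producer's IAM role: it can put
# records on this one stream but never sees the Redshift username/password.
producer_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["firehose:PutRecord", "firehose:PutRecordBatch"],
            "Resource": STREAM_ARN,
        }
    ],
}

print(json.dumps(producer_policy, indent=2))
```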
Read on below to see how to create the cross-account stream.
In my case both accounts are accessible to me, and to lower the amount of changes and ease monitoring, I created the stream on the Account A side.
The above permissions are right; however, you cannot create a Firehose stream from Account A to Account B through the AWS Console. You need to do it through the AWS CLI:
aws firehose create-delivery-stream \
    --delivery-stream-name testFirehoseStreamToRedshift \
    --redshift-destination-configuration 'RoleARN="arn:aws:iam::11111111111:role/AccountAXAccountBRole",ClusterJDBCURL="jdbc:redshift://<cluster-url>:<cluster-port>/<>",CopyCommand={DataTableName="<schema_name>.x_test",DataTableColumns="ID1,STRING_DATA1",CopyOptions="csv"},Username="<Cluster_User_name>",Password="<Cluster_Password>",S3Configuration={RoleARN="arn:aws:iam::11111111111:role/AccountAXAccountBRole",BucketARN="arn:aws:s3:::b-bucket",Prefix="test/",CompressionFormat="UNCOMPRESSED"}'
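The same delivery stream can also be created from Python with boto3. The sketch below only assembles the RedshiftDestinationConfiguration payload; the live create_delivery_stream call is left commented out, and every `<...>` value is a placeholder to fill in:

```python
# A boto3 equivalent of the CLI call above (a sketch, not the original
# answer's code). Only the request payload is built here.

role_arn = "arn:aws:iam::11111111111:role/AccountAXAccountBRole"

redshift_config = {
    "RoleARN": role_arn,
    "ClusterJDBCURL": "jdbc:redshift://<cluster-url>:<cluster-port>/<database>",
    "CopyCommand": {
        "DataTableName": "<schema_name>.x_test",
        "DataTableColumns": "ID1,STRING_DATA1",
        "CopyOptions": "csv",
    },
    "Username": "<Cluster_User_name>",
    "Password": "<Cluster_Password>",
    "S3Configuration": {
        "RoleARN": role_arn,
        "BucketARN": "arn:aws:s3:::b-bucket",
        "Prefix": "test/",
        "CompressionFormat": "UNCOMPRESSED",
    },
}

# The live call needs boto3 and credentials, so it is commented out:
# import boto3
# firehose = boto3.client("firehose", region_name="us-east-1")
# firehose.create_delivery_stream(
#     DeliveryStreamName="testFirehoseStreamToRedshift",
#     RedshiftDestinationConfiguration=redshift_config,
# )
```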
You can test this by creating a test table on the other AWS Account:
create table test_schema.x_test
(
    ID1 INT8 NOT NULL,
    STRING_DATA1 VARCHAR(10) NOT NULL
)
distkey(ID1)
sortkey(ID1, STRING_DATA1);
You can send test data like this:
aws firehose put-record \
    --delivery-stream-name testFirehoseStreamToRedshift \
    --record '{"DATA":"1,\"ABCDEFGHIJ\""}'
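For a scripted producer, the same record can be sent with boto3's put_record. This sketch only builds the payload that the CLI example sends; the live call is commented out since it needs boto3 and credentials:

```python
import json

# The same record the CLI put-record example sends, built in Python.
# Firehose buffers the raw bytes to S3 before Redshift COPYs them.
payload = '{"DATA":"1,\\"ABCDEFGHIJ\\""}'
record = {"Data": payload.encode("utf-8")}

# The live call needs boto3 and credentials, so it is commented out:
# import boto3
# firehose = boto3.client("firehose", region_name="us-east-1")
# firehose.put_record(
#     DeliveryStreamName="testFirehoseStreamToRedshift",
#     Record=record,
# )

print(json.loads(payload)["DATA"])
```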
This, together with the permissions configuration above, should create the cross-account access for you.
Documentation:
Create Stream - docs.aws.amazon.com/cli/latest/reference/firehose/create-delivery-stream.html
Put Record - docs.aws.amazon.com/cli/latest/reference/firehose/put-record.html