
Automate replication of row-level security from AWS Lake Formation to Amazon QuickSight


Amazon QuickSight is a cloud-powered, serverless, and embeddable business intelligence (BI) service that makes it easy to deliver insights to your organization. As a fully managed service, Amazon QuickSight lets you create and publish interactive dashboards that can then be accessed from different devices and embedded into your applications, portals, and websites.

When authors create datasets, build dashboards, and share them with end users, the users will see the same data as the author, unless row-level security (RLS) is enabled on the Amazon QuickSight dataset. Amazon QuickSight also provides options to pass a reader's identity to a data source using trusted identity propagation and apply RLS at the source. To learn more, see Centrally manage permissions for tables and views accessed from Amazon QuickSight with trusted identity propagation and Simplify access management with Amazon Redshift and AWS Lake Formation for users in an external identity provider.

However, there are several requirements when using trusted identity propagation with Amazon QuickSight:

  • The authentication method for Amazon QuickSight must be AWS IAM Identity Center.
  • A dataset created using trusted identity propagation must be a direct query dataset in Amazon QuickSight. QuickSight SPICE can't be used with trusted identity propagation, because with SPICE the data is imported (replicated), so the entitlements at the source can't be applied when readers access the dashboard.

This post outlines a solution to automatically replicate the entitlements for readers from the source (AWS Lake Formation) to Amazon QuickSight. The solution can be used even when the authentication method in Amazon QuickSight is not IAM Identity Center, and it works with both direct query and SPICE datasets in Amazon QuickSight. This lets you take advantage of the automatic scaling that comes with SPICE. Although we focus on a Lake Formation table that exists in the same account, you can extend the solution to cross-account tables as well. When extracting data filter rules for a table in another account, the execution role must have the necessary access to the tables in the other account.

Use case overview

For this post, consider a large financial institution that has implemented Lake Formation as its central data lake and entitlement management system. The institution aims to streamline access control and maintain a single source of truth for data permissions across its entire data ecosystem. By using Lake Formation for entitlement management, the institution can maintain a robust, scalable, and compliant data access control system that serves as the foundation for its data-driven operations and analytics initiatives. This approach is especially important for staying compliant with financial regulations and maintaining data security. The analytics team wants to build an Amazon QuickSight dashboard for the data and business teams.

Solution overview

This solution uses the AWS Lake Formation and Amazon QuickSight APIs to extract, transform, and store AWS Lake Formation data filters in a format that can be used in QuickSight.

The solution has two key steps:

  1. Extract and transform the row-level security (data filters) and the permissions on those data filters for the tables of interest from AWS Lake Formation.
  2. Create a rules dataset in Amazon QuickSight.

We use the following key services:

The following diagram illustrates the solution architecture.

Prerequisites

To implement this solution, you must have the following services enabled in the same account:

  1. AWS Lake Formation
  2. Amazon QuickSight
  3. AWS Identity and Access Management (IAM) permissions: make sure you have the necessary IAM permissions to perform operations across all the services mentioned in the solution overview above
  4. An AWS Lake Formation table with data filters and the right permissions
  5. Amazon QuickSight principals (users or groups)

The following sections show how to create Amazon QuickSight groups and AWS Lake Formation tables and data filters.

Create groups in QuickSight

Create two groups in Amazon QuickSight: QuickSight_Readers and QuickSight_Authors. For instructions, see Create a group with the QuickSight console.

You can then form the Amazon Resource Names (ARNs) of the groups as follows. These will be used when granting permissions on data filters in AWS Lake Formation.

  • arn:aws:quicksight:<<identity-region>>:<<AWSAccountId>>:group/<<namespace>>/QuickSight_Readers
  • arn:aws:quicksight:<<identity-region>>:<<AWSAccountId>>:group/<<namespace>>/QuickSight_Authors

You can also get the ARNs of the groups by running the Amazon QuickSight CLI command list-groups. The following screenshot shows the output.
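If you prefer to assemble the ARNs in code, the format above can be reproduced with a small helper. This is a minimal sketch; the region, account ID, and namespace below are placeholder values, not values from this post:

```python
# Build a QuickSight group ARN in the format expected by Lake Formation grants.
# The region, account ID, and namespace here are placeholders.
def quicksight_group_arn(identity_region, account_id, namespace, group_name):
    return f"arn:aws:quicksight:{identity_region}:{account_id}:group/{namespace}/{group_name}"

readers_arn = quicksight_group_arn("us-east-1", "111122223333", "default", "QuickSight_Readers")
print(readers_arn)
# → arn:aws:quicksight:us-east-1:111122223333:group/default/QuickSight_Readers
```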

Create a table in AWS Lake Formation

The following section is for example purposes and isn't necessary for production use of this solution. Complete the following steps to create a table in AWS Lake Formation using sample data. In this post, the table is named saas_sales.

  1. Download the file Saas Sales.csv.
  2. Upload the file to an Amazon S3 location.
  3. Create a table in AWS Lake Formation.

Create row-level security (data filter) in AWS Lake Formation

In AWS Lake Formation, data filters are used to filter the data in a table for a user or group. Complete the following steps to create a data filter:

  1. Create a data filter called QuickSightReaderFilter on the table saas_sales. For Row-level access, enter the expression segment='Enterprise'.
  2. Grant the Amazon QuickSight reader group access to this data filter. Use the reader group ARN from the first step for SAML users and groups.
  3. Grant the QuickSight_Authors group full access to the table. Use the author group ARN from the first step for SAML users and groups.
  4. (Optional) You can create another table called second_table, create another data filter called SecondFilter, and grant permission on it to the QuickSight_Readers group.

Now that you have set up the table, permissions, and data filters, you can extract the row-level access details for the QuickSight_Readers and QuickSight_Authors groups and the saas_sales table in AWS Lake Formation, and create the rules dataset in Amazon QuickSight for the saas_sales table.

Extract and transform data filters and permissions from AWS Lake Formation using a Lambda function

In AWS Lake Formation, data filters are created for each table, and there can be many tables in AWS Lake Formation. However, for a given team or project, there is only a specific set of tables that the BI developer is interested in. Therefore, choose a list of tables to track and update the data filters for. In a batch process, for each table in AWS Lake Formation, extract the data filter definitions and write them to Amazon S3 using the AWS Lake Formation and Amazon S3 APIs.

We use the following AWS Lake Formation APIs to extract the data filter details and permissions:

  • ListDataCellsFilter – This API is used to list all the data filters in each table that is required for the project
  • ListPermissions – This API is used to retrieve the permissions for each of the data filters extracted using the ListDataCellsFilter API

The Lambda function covers three parts of the solution:

  • Extract the data filters and the permissions on data filters for the tables of interest from AWS Lake Formation
  • Transform the data filters and permissions into a format usable in Amazon QuickSight
  • Persist the transformed data

Complete the following steps to create an AWS Lambda function:

  1. On the Lambda console, create a function called Lake_Formation_QuickSight_RLS. Use Python 3.12 as the runtime and create a new execution role.
  2. Configure the Lambda function timeout to 2 minutes. This can vary depending on the number of tables to be parsed and the number of data filters to be transformed.
  3. Attach the following permissions to the Lambda execution role:
    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "VisualEditor0",
                "Effect": "Allow",
                "Action": [
                    "lakeformation:ListDataCellsFilter",
                    "lakeformation:ListPermissions"
                ],
                "Resource": "*"
            },
            {
                "Sid": "VisualEditor1",
                "Effect": "Allow",
                "Action": "s3:PutObject",
                "Resource": "arn:aws:s3:::<bucket_used_for_storage>/*"
            }
        ]
    }

  4. Set the following environment variables for the Lambda function:

    Name – Value
    S3Bucket – Name of the S3 bucket where the output files will be stored
    tablesToTrack – List of tables to track, as a JSON array converted to a string
    Tmp – /tmp

The Lambda function gets the list of tables and the S3 bucket details from the environment variables. The list of tables is given as a JSON array converted to a string. The JSON format is shown in the following code. The values for CatalogId, DatabaseName, and Name can be fetched from the AWS Lake Formation console.

[
  {
    "CatalogId": "String",
    "DatabaseName": "String",
    "Name": "String"
  }
]
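For reference, the following standalone sketch shows how the function reads this variable; the table values are placeholders, not names from your account:

```python
import json
import os

# Simulate the Lambda environment variable: a JSON array serialized to a string.
# The catalog, database, and table values here are placeholders.
os.environ["tablesToTrack"] = json.dumps([
    {"CatalogId": "111122223333", "DatabaseName": "sales_db", "Name": "saas_sales"}
])

# Inside the Lambda handler, the string is parsed back into a list of dicts.
tables_to_track = json.loads(os.environ["tablesToTrack"])
for table in tables_to_track:
    print(table["DatabaseName"], table["Name"])
# → sales_db saas_sales
```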

  1. Add a folder named tmp.
  2. Download the zip file Lake_Formation_QuickSight_RLS.zip.
    Note: This is sample code for non-production usage. You should work with your security and legal teams to meet your organizational security, regulatory, and compliance requirements before deployment.
  3. For the Lambda function code, upload the downloaded .zip file on the Code tab.
  4. Provide the necessary access to the execution role in AWS Lake Formation. Although the AWS Identity and Access Management (IAM) permissions are given to the Lambda execution role, explicit permission must be given to the role in AWS Lake Formation for the Lambda function to get the details about the data filters. Therefore, you have to explicitly grant access to the execution role; to limit its scope, make it a read-only admin. For more details, see Viewing data filters.

In the following sections, we explain what the Lambda function code does in more detail.

Extract data filters and permissions for data filters and tables in AWS Lake Formation

The first pass of the code takes the list of tables as input and extracts table and data filter permissions and data filter rules. The approach here is to get the permissions for the entire table and also for the data filters applied to the table. This way, both full access (table level) and partial access (data filter) can be extracted.

...
tablesToTrack = json.loads(os.environ["tablesToTrack"])
lf_client = boto3.client('lakeformation')
# For each table in the list, get the data filter rules attached to the table.
for table in tablesToTrack:
    df_response = lf_client.list_data_cells_filter(
        Table=table
    )
    d_filters += df_response["DataCellsFilters"]

    # Also, for each table in the list, get the list of permissions at table level.
    # This determines who has access to all rows in the table.
    tresponse = lf_client.list_permissions(
        Resource={
            "Table": table
        }
    )
    d_permissions += tresponse["PrincipalResourcePermissions"]

transformDataFilterRules(d_filters)

# For each data filter fetched above, get the permissions.
# This determines the row-level security for the tables.
for filter in d_filters:
    p_response = lf_client.list_permissions(
        Resource={
            "DataCellsFilter": {
                "DatabaseName": filter["DatabaseName"],
                "Name": filter["Name"],
                "TableCatalogId": filter["TableCatalogId"],
                "TableName": filter["TableName"]
            }
        }
    )
    d_permissions += p_response["PrincipalResourcePermissions"]

transformFilterandTablePermissions(d_permissions)

Transform data filter definitions into a format usable in Amazon QuickSight

The extracted permissions and filters are transformed to create a rules dataset in Amazon QuickSight. There are different ways to define data filters. The following figure illustrates some example transformations.

The function transformDataFilterRules in the following code can transform some of the OR and AND conditions into an Amazon QuickSight acceptable format. The following details are available in the transformed format:

  • Lake Formation catalog ID
  • Lake Formation database name
  • Lake Formation table name
  • Lake Formation data filter name
  • List of columns from all the tables provided in the input for which the data filter rules are defined

See the following code:

def transformDataFilterRules(rules):
    global complete_transformed_filter_rules
    transformed_filter_rules = []
    filter_to_extract = []
    complete_transformed_filter_rules = []
    col_headers = []
    col_headers.append("catalog")
    col_headers.append("database")
    col_headers.append("table")
    col_headers.append("filter")

    for rule in rules:
        print(rule)
        catalog = rule["TableCatalogId"]
        database = rule["DatabaseName"]
        table = rule["TableName"]
        filter = rule["Name"]
        row = []
        row.append(catalog)
        row.append(database)
        row.append(table)
        row.append(filter)
        logger.info(f"row==={row}")

        # Split the filter expression on OR/AND into individual conditions.
        f_conditions = re.split(' OR | or | and | AND ', rule["RowFilter"]["FilterExpression"])

        for f_condition in f_conditions:
            logger.info(f"f_condition={f_condition}")
            f_condition = f_condition.replace("(", "")
            f_condition = f_condition.replace(")", "")
            filter_rule_column = f_condition.split("=")
            if len(filter_rule_column) > 1:
                filter_rule_column[0] = filter_rule_column[0].strip()
                if not filter_rule_column[0].strip() in col_headers:
                    col_headers.append(filter_rule_column[0].strip())
                i = col_headers.index(filter_rule_column[0].strip())
                j = i - (len(row) - 1)
                if j > 0:
                    for x in range(1, j):
                        row.append("")
                logger.info(f"i={i} j={j} {filter_rule_column[1]}")
                row.insert(i, filter_rule_column[1].replace("'", ""))
                print(row)
        transformed_filter_rules.append(','.join(row))

        row = []
        row.append(catalog)
        row.append(database)
        row.append(table)
        row.append(filter)

    # Pad every row to the full set of columns discovered across all filters.
    max_columns = len(col_headers)
    complete_transformed_filter_rules = []
    for rule in transformed_filter_rules:
        r = rule.split(",")
        to_fill = max_columns - len(r)
        if to_fill > 0:
            for x in range(1, to_fill + 1):
                r.append("")
        complete_transformed_filter_rules.append(','.join(r))

    complete_transformed_filter_rules.insert(0, ','.join(col_headers))

The following figure is an example of the transformed file. The file contains the columns for both tables. When creating a rules dataset for a specific table, the records are filtered for that table when pulled into Amazon QuickSight.
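To see the core of the transformation in isolation, the following standalone sketch splits a filter expression on OR/AND and extracts column/value pairs the same way the function above does. The expression itself is a hypothetical example, not one from this post:

```python
import re

# A hypothetical Lake Formation row filter expression.
expression = "(segment='Enterprise') OR (country='US' AND segment='SMB')"

# Split on OR/AND, strip parentheses and quotes, and collect (column, value) pairs,
# mirroring the logic inside transformDataFilterRules.
pairs = []
for condition in re.split(' OR | or | and | AND ', expression):
    condition = condition.replace("(", "").replace(")", "")
    parts = condition.split("=")
    if len(parts) > 1:
        pairs.append((parts[0].strip(), parts[1].replace("'", "")))

print(pairs)
# → [('segment', 'Enterprise'), ('country', 'US'), ('segment', 'SMB')]
```

Each pair becomes a value placed under the matching column header in the transformed CSV, so a reader entitled through this filter is limited to those column values.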

The function transformFilterandTablePermissions in the following code snippet combines and transforms the table and data filter permissions into a flat structure that contains the following columns:

  • Amazon QuickSight group ARN
  • Lake Formation catalog ID
  • Lake Formation database name
  • Lake Formation table name
  • Lake Formation data filter name

See the following code:

def transformFilterandTablePermissions(permissions):
    global transformed_table_permissions, transformed_filter_permissions
    # Read and set table-level access
    transformed_table_permissions = []
    transformed_filter_permissions = []
    transformed_filter_permissions.insert(0, "group,catalog,database,table,filter")
    transformed_table_permissions.insert(0, "group,catalog,database,table")

    for permission in permissions:
        group = ""
        database = ""
        table = ""
        catalog = ""

        p = permission["Permissions"]

        if "DESCRIBE" in p or "SELECT" in p:

            group = permission["Principal"]["DataLakePrincipalIdentifier"]
            if "Database" in permission["Resource"]:
                catalog = permission["Resource"]["Database"]["CatalogId"]
                database = permission["Resource"]["Database"]["Name"]
                table = "*"
                transformed_table_permissions.append(group + "," + catalog + "," + database + "," + table)
                transformed_filter_permissions.append(group + "," + catalog + "," + database + "," + table)
            elif "TableWithColumns" in permission["Resource"] or "Table" in permission["Resource"]:
                if "TableWithColumns" in permission["Resource"]:
                    catalog = permission["Resource"]["TableWithColumns"]["CatalogId"]
                    database = permission["Resource"]["TableWithColumns"]["DatabaseName"]
                    table = permission["Resource"]["TableWithColumns"]["Name"]
                elif "Table" in permission["Resource"]:
                    catalog = permission["Resource"]["Table"]["CatalogId"]
                    database = permission["Resource"]["Table"]["DatabaseName"]
                    table = permission["Resource"]["Table"]["Name"]
                transformed_table_permissions.append(group + "," + catalog + "," + database + "," + table)
                transformed_filter_permissions.append(group + "," + catalog + "," + database + "," + table)
            elif "DataCellsFilter" in permission["Resource"]:
                catalog = permission["Resource"]["DataCellsFilter"]["TableCatalogId"]
                database = permission["Resource"]["DataCellsFilter"]["DatabaseName"]
                table = permission["Resource"]["DataCellsFilter"]["TableName"]
                filter = permission["Resource"]["DataCellsFilter"]["Name"]
                transformed_filter_permissions.append(group + "," + catalog + "," + database + "," + table + "," + filter)

The following figure is an example of the extracted data filter and table permissions. AWS Lake Formation can have data filters applied to any principal. However, we focus on the Amazon QuickSight principals:

  • The QuickSight_Authors ARN has full access to two tables. This is determined by transforming the table-level permissions along with the data filter permissions.
  • The QuickSight_Readers ARN has limited access based on filter conditions.
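As a concrete illustration, the flattened filter-permissions rows for the two example groups would look roughly like the following. The ARNs, account ID, and database name are placeholders; note that a table-level grant is recorded without a filter name, while a data filter grant carries one:

```python
# Hypothetical flattened permission records (header + rows), with placeholder values.
filter_permissions = [
    "group,catalog,database,table,filter",
    # Authors: table-level access, so no filter name is recorded.
    "arn:aws:quicksight:us-east-1:111122223333:group/default/QuickSight_Authors,111122223333,sales_db,saas_sales",
    # Readers: access only through the QuickSightReaderFilter data filter.
    "arn:aws:quicksight:us-east-1:111122223333:group/default/QuickSight_Readers,111122223333,sales_db,saas_sales,QuickSightReaderFilter",
]

header = filter_permissions[0].split(",")
reader_row = dict(zip(header, filter_permissions[2].split(",")))
print(reader_row["filter"])
# → QuickSightReaderFilter
```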

Store the transformed rules and permissions in separate files in Amazon S3

The transformed rules and permissions are then persisted in a data store. In this solution, the transformed rules are written to an Amazon S3 location in CSV format. The names of the files created by the Lambda function are:

  • transformed_filter_permissions.csv
  • transformed_filter_rules.csv

See the following code:

with open("/tmp/transformed_table_permissions.csv", "w") as txt_file:
    for line in transformed_table_permissions:
        txt_file.write(line + "\n")  # works with any number of elements in a line

s3 = boto3.resource('s3')
s3.meta.client.upload_file(Filename="/tmp/transformed_table_permissions.csv", Bucket=os.environ['S3Bucket'], Key="table-permissions/transformed_table_permissions.csv")

with open("/tmp/transformed_filter_permissions.csv", "w") as txt_file:
    for line in transformed_filter_permissions:
        txt_file.write(line + "\n")

s3.meta.client.upload_file(Filename="/tmp/transformed_filter_permissions.csv", Bucket=os.environ['S3Bucket'], Key="filter-permissions/transformed_filter_permissions.csv")

with open("/tmp/transformed_filter_rules.csv", "w") as txt_file:
    for line in complete_transformed_filter_rules:
        txt_file.write(line + "\n")

s3.meta.client.upload_file(Filename="/tmp/transformed_filter_rules.csv", Bucket=os.environ['S3Bucket'], Key="filter-rules/transformed_filter_rules.csv")

Create a rules dataset in Amazon QuickSight

In this section, we walk through the steps to create a rules dataset in Amazon QuickSight.

Create a table in Lake Formation for the files

The first step is to create a table in AWS Lake Formation for each of the two files, transformed_filter_permissions.csv and transformed_filter_rules.csv.

Although you can directly use an Amazon S3 connector in Amazon QuickSight, creating a table and building the rules dataset using an Athena connector gives the flexibility of writing custom SQL and using direct query. For the steps to bring an Amazon S3 location into AWS Lake Formation, see Creating tables.

For this post, the tables for the files are created in a separate database called quicksight_lf_transformation.

Grant permission on the tables to the QuickSight_Authors group

Grant permission in AWS Lake Formation on the two tables to the QuickSight_Authors group. This is essential for Amazon QuickSight authors to create a rules dataset in Amazon QuickSight. The following screenshot shows the permission details.

Create a rules dataset in Amazon QuickSight

Amazon QuickSight supports both user-level and group-level RLS. In this post, we use groups to enable RLS. To create the rules dataset, you first join the filter permissions table with the filter rules table on the columns catalog, database, table, and filter. Then you can filter the permissions to include only the Amazon QuickSight principals, and include only the columns required for the dataset. The objective in this solution is to build a rules dataset for the saas_sales table.

Complete the following steps:

  1. On the Amazon QuickSight console, create a new Athena dataset.
  2. Specify the following:
    1. For Catalog, choose AWSDataCatalog.
    2. For Database, choose quicksight_lf_transformation.
    3. For Table, choose filter_permissions.
  3. Choose Edit/Preview data.
  4. Choose Add data.
  5. Choose Add source.
  6. Select Athena.
  7. Specify the following:
    1. For Catalog, choose AWSDataCatalog.
    2. For Database, choose quicksight_lf_transformation.
    3. For Table, choose filter_rules.

  8. Join the permissions table with the data filter rules table on the catalog, database, table, and filter columns.
  9. Rename the column group to GroupArn. This must be done before the filter is applied.
  10. Filter the data where the column table equals saas_sales.
  11. Filter the data where the column group starts with arn:aws:quicksight (Amazon QuickSight principals).
  12. Exclude fields that aren't part of the saas_sales table.
  13. Change Query mode to SPICE.
  14. Publish the dataset.

If your organization has a mapping of other principals to an Amazon QuickSight group or user, you can apply that mapping before joining the tables.
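After these steps, the rules dataset for saas_sales contains one row per QuickSight principal. The following sketch shows hypothetical contents (placeholder ARNs); an empty segment value gives the authors group access to all rows, which is how QuickSight RLS rules datasets treat blank rule values:

```python
import csv
import io

# Hypothetical contents of LakeFormationRLSDataSet after the join and filters.
rules_csv = """GroupArn,segment
arn:aws:quicksight:us-east-1:111122223333:group/default/QuickSight_Authors,
arn:aws:quicksight:us-east-1:111122223333:group/default/QuickSight_Readers,Enterprise
"""

rows = list(csv.DictReader(io.StringIO(rules_csv)))
for row in rows:
    print(row["GroupArn"].rsplit("/", 1)[-1], "->", row["segment"] or "<all rows>")
# → QuickSight_Authors -> <all rows>
# → QuickSight_Readers -> Enterprise
```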

You can also write the following custom SQL to achieve the same result:

SELECT a."group" as GroupArn, segment FROM "quicksight_lf_transformation"."filter_permissions" as a
left join
"quicksight_lf_transformation"."filter_rules" as b
on
a.catalog = b.catalog and
a.database = b.database and
a."table" = b."table" and
a.filter = b.filter
where a."table" = 'saas_sales'
and a."group" like 'arn:aws:quicksight%'

  15. Name the dataset LakeFormationRLSDataSet and publish the dataset.

Test the row-level security

Now you're ready to test the row-level security by publishing a dashboard as a user in the QuickSight_Authors group and then viewing the dashboard as a user in the QuickSight_Readers group.

Publish a dashboard as a QuickSight_Authors group user

As an author who belongs to the QuickSight_Authors group, the user will be able to see the saas_sales table in the Athena connector and all the data in the table. As shown in this section, all three segments are visible to the author when creating an analysis and viewing the published dashboard.

  1. Create a dataset by pulling data from the saas_sales table using the Athena connector.
  2. Attach LakeFormationRLSDataSet as the RLS dataset for the saas_sales dataset. For instructions, see Using row-level security with user-based rules to restrict access to a dataset.
  3. Create an analysis using the saas_sales dataset as an author who belongs to the QuickSight_Authors group.
  4. Publish the dashboard.
  5. Share the dashboard with the QuickSight_Readers group.

View the dashboard as a QuickSight_Readers group user

Complete the following steps to view the dashboard as a QuickSight_Readers group user:

  1. Log in to Amazon QuickSight as a reader who belongs to the QuickSight_Readers group.

The user will be able to see only the Enterprise segment.

  2. Now, change the RLS in AWS Lake Formation, and set the segment to SMB for the QuickSightReaderFilter.
  3. Run the Lambda function to export and transform the new data filter rules.
  4. Refresh the SPICE dataset LakeFormationRLSDataSet in Amazon QuickSight.
  5. When the refresh is complete, refresh the dashboard in the reader login.

Now the reader will see the SMB data.

Clean up

Amazon QuickSight resources

  1. Delete the Amazon QuickSight dashboard and analysis you created
  2. Delete the datasets saas_sales and LakeFormationRLSDataSet
  3. Delete the Athena data source
  4. Delete the QuickSight groups using the DeleteGroup API

AWS Lake Formation resources

  1. Delete the database quicksight_lf_transformation created in AWS Lake Formation
  2. Revoke the permissions given to the Lambda execution role
  3. Delete the saas_sales table and the data filters created
  4. If you used an AWS Glue crawler to create the tables in AWS Lake Formation, remove the Glue crawler as well

Compute resources

  1. Delete the AWS Lambda function created
  2. Delete the AWS Lambda execution role associated with the function

Storage resources

  1. Empty the contents of the Amazon S3 bucket created for this solution
  2. Delete the Amazon S3 bucket

Conclusion

This post explained how to automatically replicate row-level security from AWS Lake Formation to Amazon QuickSight. This makes sure that a SPICE dataset in QuickSight can use the row-level access defined in Lake Formation.

This solution can also be extended to other data sources. The logic to programmatically extract the entitlements from the source and transform them into the Amazon QuickSight format will vary by source. After the extract and transform are in place, it can scale to multiple teams in the organization. Although this post laid out a basic approach, the automation should be either scheduled to run periodically or triggered by events such as data filter changes or the grant or revoke of AWS Lake Formation permissions, to make sure the entitlements stay in sync between AWS Lake Formation and Amazon QuickSight.
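As one illustration of the scheduling option, the following sketch builds the parameters you might pass to Amazon EventBridge Scheduler to invoke the Lambda function hourly. The schedule name, rate, and ARNs are all assumptions for illustration; in practice you would pass a dictionary like this to boto3's scheduler create_schedule call:

```python
# Hypothetical EventBridge Scheduler parameters to run the sync Lambda hourly.
# The name, rate, and ARNs below are placeholders, not values from this post.
schedule_params = {
    "Name": "lake-formation-quicksight-rls-sync",
    "ScheduleExpression": "rate(1 hour)",
    "FlexibleTimeWindow": {"Mode": "OFF"},
    "Target": {
        "Arn": "arn:aws:lambda:us-east-1:111122223333:function:Lake_Formation_QuickSight_RLS",
        "RoleArn": "arn:aws:iam::111122223333:role/scheduler-invoke-role",
    },
}
print(schedule_params["ScheduleExpression"])
# → rate(1 hour)
```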

Try out this solution for your own use case, and share your feedback in the comments.


About the Authors

Vetri Natarajan is a Specialist Solutions Architect for Amazon QuickSight. Vetri has 15 years of experience implementing enterprise business intelligence (BI) solutions and greenfield data products. Vetri specializes in integrating BI solutions with business applications to enable data-driven decisions.

Ismael Murillo is a Solutions Architect for Amazon QuickSight. Before joining AWS, Ismael worked in Amazon Logistics (AMZL) with delivery station management, delivery service providers, and our customers directly in the field. Ismael focused on last-mile delivery and delivery success. He designed and implemented many innovative solutions to help reduce cost and improve delivery success. He is also a United States Army veteran, and served for eleven years.
