Snowflake to BigQuery migration

This document provides technical background on how to migrate data from Snowflake to BigQuery. It covers the foundational differences between Snowflake and BigQuery, and it provides guidance for a successful migration, such as the following:

  • What schema changes are needed
  • What migration tools and options are available
  • How to migrate data (using a sample export process)

You can also use batch SQL translation to migrate your SQL scripts in bulk, or interactive SQL translation to translate ad hoc queries. Snowflake SQL is supported by both tools in preview.

Terminology

This document uses Snowflake and BigQuery terminology to describe the capabilities that each product provides. The following table maps Snowflake terms to equivalent BigQuery terms:

Snowflake | BigQuery
Database | Dataset
Schema | Schema
Session-specific temporary or transient table | Anonymous or temporary table
View | View
Secure views | Authorized views
Virtual warehouse | Reservation
Materialized view | Materialized view
No equivalent for partitioning (because micro-partitioning is used) | Partitioning
Clustering | Clustering
Security-enhanced user-defined functions (UDFs) | Authorized UDFs

Architectural comparison

Snowflake and BigQuery are both analytic data warehouses, but they have some key architectural differences.

Snowflake's architecture is a hybrid of shared-disk and shared-nothing database architectures. As with a shared-disk architecture, data in Snowflake is managed in a separate, centrally accessible cloud object storage service. As with shared-nothing architectures, queries in Snowflake use dedicated compute clusters, and each cluster manages cached portions of the dataset to accelerate query performance. For more information, see Snowflake architecture.

BigQuery's architecture is vastly different from node-based cloud data warehouse solutions or MPP systems. It decouples storage and compute, which allows them to scale independently on demand. For more information, see BigQuery under the hood.

User interface comparison

The Snowflake web interface mirrors the Snowflake command line interface (CLI). Both interfaces let you do the following:

  • Manage databases
  • Manage warehouses
  • Manage queries and worksheets
  • View historical queries

The web interface also lets you manage your Snowflake password and user preferences.

The Snowflake CLI client, SnowSQL, connects to Snowflake to run SQL queries and other operations.

The BigQuery interface is built into the Google Cloud console and contains a list of BigQuery resources that you can view:

  • The BigQuery Studio section displays your datasets, tables, views, and other BigQuery resources. This is where you can create and run queries, work with tables and views, see your BigQuery job history, and perform other common BigQuery tasks.
  • The Data transfers section opens the BigQuery Data Transfer Service page.
  • The Scheduled queries section displays your scheduled queries.
  • The Capacity management section displays slot commitments, reservations, and reservation assignments.
  • The BI Engine section opens the BigQuery BI Engine page.

BigQuery also has a Python-based command-line tool, bq. For more information, see Using the bq command-line tool.

Security

When migrating from Snowflake to BigQuery, you must consider the way that Google Cloud in general, and BigQuery in particular, handles security differently from Snowflake.

Snowflake has various security-related features, such as the following:

  • Network and site access
  • Account and user authentication
  • Object security
  • Data security
  • Security validations

Security on Snowflake is based on your cloud provider's features. It provides granular control over access to objects, object operations, and who can create or alter access control policies.

The BigQuery parallel to the access control privileges in Snowflake is Identity and Access Management (IAM) roles in Google Cloud. These roles determine the operations that are allowed on a resource and are enforced at the Google Cloud level.

Encryption

In Snowflake, column-level security is supported in the Enterprise edition, and customer-managed encryption keys are supported in the Business Critical edition. These editions have different pricing. In BigQuery, all features and enhanced security measures are offered as standard features at no additional cost.

Snowflake provides end-to-end encryption in which it automatically encrypts all stored data. Google Cloud provides the same feature by encrypting all data at rest and in transit by default.

Similar to Snowflake Business Critical edition, BigQuery supports customer-managed encryption keys for users who want to control and manage key encryption keys in Cloud Key Management Service. BigQuery also allows column-level encryption. For more information about encryption in Google Cloud, see Encryption at rest in Google Cloud and Encryption in transit in Google Cloud.
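As an illustration, the following sketch shows how a customer-managed key might be attached to a table in BigQuery DDL. The project, dataset, table, key ring, and key names are hypothetical placeholders:

  -- Hypothetical example: create a table whose data is protected with a
  -- customer-managed key from Cloud KMS instead of a Google-managed key.
  CREATE TABLE `my-project.sales_ds.sensitive_orders` (
    order_id    INT64,
    card_number STRING
  )
  OPTIONS (
    kms_key_name = 'projects/my-project/locations/us/keyRings/my-keyring/cryptoKeys/my-key'
  );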

Roles

Roles are the entities to which privileges on securable objects can be granted and revoked.

Snowflake supports the following two types of roles:

  • System-defined roles: These roles consist of system and security-related privileges and are created with privileges related to account management.
  • Custom roles: You can create these roles by using the SECURITYADMIN role or by using any role that has the CREATE ROLE privilege. Each custom role in Snowflake is composed of privileges.

In IAM, permissions are grouped into roles. IAM provides three types of roles:

  • Basic roles: These roles include the Owner, Editor, and Viewer roles. You can apply these roles at the project or service resource levels by using the Google Cloud console, the Identity and Access Management API, or the gcloud CLI. In general, for the strongest security, we recommend that you use BigQuery-specific roles to follow the principle of least privilege.
  • Predefined roles: These roles provide more granular access to features in a product (such as BigQuery) and are meant to support common use cases and access control patterns.
  • Custom roles: These roles are composed of user-specified permissions.

Access control

Snowflake lets you grant roles to other roles, creating a hierarchy of roles. IAM doesn't support a role hierarchy but implements a resource hierarchy. The IAM hierarchy includes the organization level, folder level, project level, and resource level. You can set IAM roles at any level of the hierarchy, and resources inherit all the policies of their parent resources.

Both Snowflake and BigQuery support table-level access control. Table-level permissions determine the users, groups, and service accounts that can access a table or view. You can give a user access to specific tables or views without giving the user access to the complete dataset.

Snowflake also uses row-level security and column-level security.

In BigQuery, IAM provides table-level access control. For more granular access, you can also use column-level access control or row-level security. This type of control provides fine-grained access to sensitive columns by using policy tags or type-based data classifications.

You can also create authorized views to limit data access for more fine-grained access control, so that specified users can query a view without having read access to the underlying tables.
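As a minimal sketch (the project, dataset, table, and user names are hypothetical), table-level and row-level access in BigQuery can be expressed in SQL:

  -- Grant read access on a single table without exposing the whole dataset.
  GRANT `roles/bigquery.dataViewer`
  ON TABLE `my-project.sales_ds.orders`
  TO 'user:analyst@example.com';

  -- Row-level security: this analyst only sees rows for the APAC region.
  CREATE ROW ACCESS POLICY apac_only
  ON `my-project.sales_ds.orders`
  GRANT TO ('user:analyst@example.com')
  FILTER USING (region = 'APAC');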

Things to consider when migrating

There are a few Snowflake features that you cannot port directly to BigQuery. For example, BigQuery doesn't offer built-in support for the following scenarios. In these scenarios, you might need to use other services in Google Cloud.

  • Time travel: In BigQuery, you can use time travel to access data from any point within the last seven days. If you need to access data beyond seven days, consider exporting regularly scheduled snapshots. Snowflake lets you access historical data (data that has been changed or deleted) at any point within a defined period, which you can set to any value from 0 to 90 days. (See the example after this list.)

  • Streams: BigQuery supports change data capture (CDC) with Datastream. You can also use CDC software, like Debezium, to write records to BigQuery with Dataflow. For more information on manually designing a CDC pipeline with BigQuery, see Migrating data warehouses to BigQuery: Change data capture (CDC). In Snowflake, a stream object records data manipulation language changes made to tables, along with metadata about each change, so that you can take action on the changed data.

  • Tasks: BigQuery lets you schedule queries and streams, or integrate streams into queries, with Datastream. Snowflake can combine tasks with table streams for continuous extract, load, and transfer workflows to process recently changed table rows.

  • External functions: BigQuery supports external function calls through Cloud Run functions. You can also use user-defined functions (UDFs), like SQL UDFs, although these functions aren't executed outside of BigQuery. In Snowflake, an external function calls code that runs outside Snowflake; for example, information sent to a remote service is usually relayed through a proxy service.
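To illustrate the time travel difference mentioned in the first bullet, here is a minimal sketch (table names are hypothetical) comparing the two dialects:

  -- BigQuery: query the table as it was 24 hours ago (within the 7-day window).
  SELECT *
  FROM `my-project.sales_ds.orders`
  FOR SYSTEM_TIME AS OF TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 24 HOUR);

  -- Snowflake: equivalent query using Time Travel with an offset in seconds.
  SELECT *
  FROM orders AT (OFFSET => -60*60*24);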

Migrate data from Snowflake to BigQuery

This section describes how to configure and initiate your migration from Snowflake to BigQuery based on the framework outlined in Migrating data warehouses to BigQuery: What and how to migrate.

Architecture

To start the migration, you run both Snowflake and BigQuery. The following diagram shows an architecture that minimally affects existing operations. By transferring clean, quality-controlled data, you can reuse existing tools and processes while offloading workloads to BigQuery. You can also validate reports and dashboards against earlier versions. Nevertheless, because OLAP data is maintained in redundant places, this operation isn't cost-effective. It also extends the processing time.

  • Point 1 shows data moving from Snowflake to Cloud Storage.
  • Point 2 shows the persistence of the data to BigQuery.
  • Point 3 shows how the data is sent to the end user.

You can validate reports and dashboards against old iterations. For more information, see Migrating data warehouses to BigQuery: Verify and validate.

[Diagram 1: interim architecture, with Snowflake and BigQuery running in parallel during the migration]

The final architecture for your data warehouse migration has all the data from source systems directly persisted in Google Cloud. Depending on the number and complexity of the source systems, delivering this architecture can be further staged by addressing source systems one at a time according to priority, interdependencies, integration risks, or other business factors.

The following diagram assumes the migration of data pipelines and ingestion to Google Cloud.

  • Point 1 shows both synchronous and asynchronous integration points. Synchronous integration is, for example, between data sources and App Engine when dealing with use cases that require explicit user actions as a part of the flow.
  • Point 2 shows using Pub/Sub for large volumes of concurrent event data.
  • Point 3 shows the persistence of data using one or more Google Cloud products, depending on the nature of the data.
  • Point 4 shows the extract, transform, and load (ETL) process to BigQuery.

[Diagram 2: final architecture, with data pipelines and ingestion migrated to Google Cloud]

Prepare your Cloud Storage environment

Google Cloud offers several ways to transfer your data to BigQuery using other ETL tools. The pattern is as follows:

  1. Extract the data from your source: Copy the extracted files from your source into staging storage in your on-premises environment. For more information, see Migrating data warehouses to BigQuery: Extracting the source data.

  2. Transfer data to a staging Cloud Storage bucket: After you finish extracting data from your source, you transfer it to a temporary bucket in Cloud Storage. Depending on the amount of data you're transferring and the network bandwidth available, you have several options.

    It's important to ensure that the location of your BigQuery dataset and your external data source, or your Cloud Storage bucket, are in the same region. For more information on geographical location considerations for loading data from Cloud Storage, see Batch loading data.

  3. Load data from the Cloud Storage bucket into BigQuery: Your data is now in a Cloud Storage bucket, closer to its destination. There are several options to upload the data into BigQuery, depending on how much transformation the data needs. Alternatively, you can transform your data within BigQuery by following the ELT approach. A minimal load sketch follows this list.

    When you import your data in bulk from a JSON file, an Avro file, or a CSV file, BigQuery auto-detects the schema, so you don't need to predefine it. To get a detailed overview of the schema migration process for EDW workloads, see Schema and data migration process.
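As a minimal sketch of step 3, assuming Parquet files were unloaded to a hypothetical gs://mybucket/unload/ path, the load can be expressed directly in GoogleSQL:

  -- Load the unloaded Parquet files from Cloud Storage into a BigQuery table.
  -- The schema is read from the Parquet files themselves.
  LOAD DATA INTO `my-project.sales_ds.orders`
  FROM FILES (
    format = 'PARQUET',
    uris = ['gs://mybucket/unload/*.parquet']
  );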

Supported data types, properties, and file formats

Snowflake and BigQuery support most of the same data types, though they sometimes use different names. For a complete list of supported data types in Snowflake and BigQuery, see the Data types section of the Snowflake SQL translation reference. You can also use the batch SQL translator to translate your SQL scripts in bulk. For more information about BigQuery's supported data types, see GoogleSQL data types.

Snowflake can export data in the following file formats, which you can load directly into BigQuery:

  • CSV: See Loading CSV data from Cloud Storage.
  • Parquet: See Loading Parquet data from Cloud Storage.
  • JSON (newline-delimited): See Loading JSON data from Cloud Storage.

Schema changes

If you are planning schema changes in your migration to BigQuery, we recommend that you first migrate your schema as-is. BigQuery supports a wide range of data model design patterns, such as star schema or snowflake schema. Because of this support, you don't need to update your upstream data pipelines for a new schema, and you can use automated migration tools to transfer your data and schema.

Updating a schema

After your data is in BigQuery, you can always make updates to the schema, such as adding columns to the schema definition or relaxing a column's mode from REQUIRED to NULLABLE.
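For example, the following sketch (with hypothetical table and column names) shows both kinds of updates in BigQuery DDL:

  -- Add a new NULLABLE column to an existing table.
  ALTER TABLE `my-project.sales_ds.orders`
  ADD COLUMN IF NOT EXISTS discount_code STRING;

  -- Relax an existing column's mode from REQUIRED to NULLABLE.
  ALTER TABLE `my-project.sales_ds.orders`
  ALTER COLUMN customer_id DROP NOT NULL;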

Remember that BigQuery table names are case-sensitive, whereas Snowflake naming is case-insensitive. This difference means that you might need to revisit the table-naming conventions that exist in Snowflake and rectify any inconsistencies that arise during the move to BigQuery. For more information on schema modification, see Modifying table schemas.

Some schema modifications aren't directly supported in BigQuery and require manual workarounds, including the following:

  • Changing a column's name.
  • Changing a column's data type.
  • Changing a column's mode (except for relaxing REQUIRED columns toNULLABLE).

For specific instructions on how to manually implement these schema changes, see Manually change table schemas.
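One common manual workaround, sketched below with hypothetical names, is to rewrite the table with a CAST. Note that this rescans the whole table and that partitioning and clustering settings must be re-declared if you need them:

  -- Change order_id from INT64 to STRING by rewriting the table in place.
  CREATE OR REPLACE TABLE `my-project.sales_ds.orders` AS
  SELECT
    * REPLACE (CAST(order_id AS STRING) AS order_id)
  FROM `my-project.sales_ds.orders`;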

Optimization

After the schema migration, you can test performance and make optimizations based on the results. For example, you can introduce partitioning to make your data more efficient to manage and query. Partitioning in BigQuery refers to a special table that is divided into segments called partitions. Partitioning is different from the micro-partitioning in Snowflake, which happens automatically as data is loaded. BigQuery's partitioning lets you improve query performance and cost control by partitioning by ingestion time, timestamp, or integer range. For more information, see Introduction to partitioned tables.
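As a minimal sketch (hypothetical names), the following DDL creates a table with daily partitions derived from a timestamp column:

  -- Queries that filter on order_ts only scan the matching daily partitions.
  CREATE TABLE `my-project.sales_ds.orders_partitioned` (
    order_id INT64,
    order_ts TIMESTAMP,
    amount   NUMERIC
  )
  PARTITION BY DATE(order_ts);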

Clustered tables

Clustered tables are another schema optimization. BigQuery, like Snowflake, lets you cluster tables, enabling you to automatically organize table data based on the contents of one or more columns in the table's schema. BigQuery uses the columns that you specify to colocate related data. Clustering can improve the performance of certain types of queries, such as queries that use filter clauses or queries that aggregate data. For more information on how clustered tables work in BigQuery, see Introduction to clustered tables.
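Building on the previous sketch, clustering columns can be declared at table creation (names are again hypothetical):

  -- Rows are colocated by customer_id and region within each daily partition,
  -- which can reduce the data scanned by filters and aggregations on those columns.
  CREATE TABLE `my-project.sales_ds.orders_clustered` (
    order_id    INT64,
    customer_id STRING,
    region      STRING,
    order_ts    TIMESTAMP
  )
  PARTITION BY DATE(order_ts)
  CLUSTER BY customer_id, region;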

Migration tools

The following list describes the tools that you can use to migrate data from Snowflake to BigQuery. These tools are combined in the Examples of migration using pipelines section to put together end-to-end migration pipelines.

  • COPY INTO <location> command: Use this command in Snowflake to unload data from a Snowflake table directly into a specified Cloud Storage bucket. For an end-to-end example, see Snowflake to BigQuery (snowflake2bq) on GitHub.
  • Apache Sqoop: To extract data from Snowflake into either HDFS or Cloud Storage, submit Hadoop jobs with Sqoop and the Snowflake JDBC driver. Sqoop runs in a Dataproc environment.
  • Snowflake JDBC: Use this driver with most client tools or applications that support JDBC.

You can use the following generic tools to migrate data from Snowflake to BigQuery:

  • BigQuery Data Transfer Service: Perform an automated batch transfer of Cloud Storage data into BigQuery with this fully managed service. This tool requires you to first export the Snowflake data to Cloud Storage.
  • The Google Cloud CLI: Copy downloaded Snowflake files into Cloud Storage with this command-line tool.
  • bq command-line tool: Interact with BigQuery using this command-line tool. Common use cases include creating BigQuery table schemas, loading Cloud Storage data into tables, and running queries.
  • Cloud Storage client libraries: Copy downloaded Snowflake files into Cloud Storage with a custom tool that uses the Cloud Storage client libraries.
  • BigQuery client libraries: Interact with BigQuery with a custom tool built on top of the BigQuery client library.
  • BigQuery query scheduler: Schedule recurring SQL queries with this built-in BigQuery feature.
  • Cloud Composer: Use this fully managed Apache Airflow environment to orchestrate BigQuery load jobs and transformations.

For more information on loading data into BigQuery, see Loading data into BigQuery.

Examples of migration using pipelines

The following sections show examples of how to migrate your data from Snowflake to BigQuery using three different techniques: extract and load, ETL, and partner tools.

Extract and load

The extract and load technique offers two methods:

  • Use a pipeline to unload data from Snowflake
  • Use a pipeline and a JDBC driver to export data from Snowflake

Use a pipeline to unload data from Snowflake

To unload data from Snowflake directly into Cloud Storage (recommended), or to download data and copy it to Cloud Storage by using the gcloud CLI or the Cloud Storage client libraries, use the snowflake2bq tool, which migrates data by using the Snowflake COPY INTO <location> command.

You then load Cloud Storage data into BigQuery with one of the following tools:

  • BigQuery Data Transfer Service
  • bq command-line tool
  • BigQuery API Client Libraries

Use a pipeline and a JDBC driver to export data from Snowflake

Use any of the following products to export Snowflake data with the JDBC driver from Snowflake:

  • Dataflow
    • JDBC to BigQuery template
  • Cloud Data Fusion
    • JDBC drivers
  • Dataproc

Extract, transform, and load

If you want to transform your data before loading it into BigQuery, you can add a transformation step in the pipelines described in the preceding Extract and load section.

Transform Snowflake data

To transform your data before loading it into BigQuery, either unload data directly from Snowflake to Cloud Storage or use the gcloud CLI to copy data over, as described in the preceding Extract and load section.

Load Snowflake data

After transforming your data, load your data into BigQuery with one of the following methods:

Use a pipeline and a JDBC driver to transform and export data from Snowflake

Add a transformation step in the following pipeline options as described in the preceding Extract and load section.

  • Dataflow
  • Cloud Data Fusion
  • Dataproc
    • Transform your data using Spark SQL or custom code in any of the supported Spark languages (Scala, Java, Python, or R).

You might have an extract, load, and transform use case to load your data from Snowflake into BigQuery and then transform it. To perform this task, load your data from Snowflake to a BigQuery staging table by using one of the methods in the preceding Extract and load section. Then, run SQL queries on the staging table and write the output to the final production table in BigQuery.
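A minimal sketch of that final transformation step, assuming hypothetical staging and production tables:

  -- Transform rows in the staging table and append them to the production table.
  INSERT INTO `my-project.sales_ds.orders` (order_id, customer_id, amount_usd)
  SELECT
    CAST(order_id AS INT64),
    UPPER(customer_id),
    amount_cents / 100        -- convert cents to dollars
  FROM `my-project.staging_ds.orders_raw`
  WHERE load_date = CURRENT_DATE();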

Partner tools for migration

There are multiple vendors that specialize in the EDW migration space. For a list of key partners and their provided solutions, see Google Cloud's BigQuery partner website.

Examples of the export process

The following sections show a sample data export from Snowflake to BigQuery that uses the COPY INTO <location> Snowflake command. For a detailed, step-by-step process that includes code samples, see the Google Cloud professional services Snowflake to BigQuery tool.

Prepare for the export

For the unload, use Snowflake SQL statements to create a named file format specification.

This tutorial uses my_parquet_unload_format for the file format, but you can use a different name.

  create or replace file format my_parquet_unload_format
    type = 'PARQUET'
    field_delimiter = '|';

Export your Snowflake data

After you prepare your data, you need to move the data to Google Cloud. You can do this step in one of the two following ways:

  1. Exporting your data directly to Cloud Storage from Snowflake.
  2. Staging your Snowflake data in either an Amazon Simple Storage Service (Amazon S3) bucket or Azure Blob Storage.

To avoid an extra data hop, directly export your data.

Export Snowflake data directly to Cloud Storage

The following instructions show how to use the Snowflake COPY command to unload data from Snowflake to Cloud Storage:

  1. In Snowflake, configure a storage integration object to let Snowflake write to a Cloud Storage bucket referenced in an external Cloud Storage stage.

    This step involves several substeps.

    1. Create an integration with the CREATE STORAGE INTEGRATION command:

      create storage integration gcs_int
        type = external_stage
        storage_provider = gcs
        enabled = true
        storage_allowed_locations = ('gcs://mybucket/unload/');
    2. Retrieve the Cloud Storage service account for Snowflake with the DESCRIBE INTEGRATION command, and grant the service account permissions to access the Cloud Storage bucket that is selected as the staging area:

      desc storage integration gcs_int;
      +-----------------------------+---------------+----------------------------------------------------------------------------+------------------+
      | property                    | property_type | property_value                                                             | property_default |
      +-----------------------------+---------------+----------------------------------------------------------------------------+------------------+
      | ENABLED                     | Boolean       | true                                                                       | false            |
      | STORAGE_ALLOWED_LOCATIONS   | List          | gcs://mybucket1/path1/,gcs://mybucket2/path2/                              | []               |
      | STORAGE_BLOCKED_LOCATIONS   | List          | gcs://mybucket1/path1/sensitivedata/,gcs://mybucket2/path2/sensitivedata/  | []               |
      | STORAGE_GCP_SERVICE_ACCOUNT | String        | service-account-id@project1-123456.iam.gserviceaccount.com                 |                  |
      +-----------------------------+---------------+----------------------------------------------------------------------------+------------------+
    3. Create an external Cloud Storage stage referencing the integration that you created with the CREATE STAGE command:

      create or replace stage my_ext_unload_stage
        url = 'gcs://mybucket/unload'
        storage_integration = gcs_int
        file_format = my_parquet_unload_format;
  2. Use the COPY INTO <location> command to copy data from the Snowflake database table into a Cloud Storage bucket by specifying the external stage object that you created in the previous step:

    copy into @my_ext_unload_stage/d1 from mytable;

Export Snowflake data to Cloud Storage through Storage Transfer Service from Amazon S3

The following example shows how to unload data from a Snowflake table to an Amazon S3 bucket by using the COPY command:

  1. In Snowflake, configure a storage integration object to allow Snowflake to write to an Amazon S3 bucket referenced in an external stage.

    This step involves configuring access permissions to the Amazon S3 bucket, creating the AWS IAM role, and creating a storage integration in Snowflake with the CREATE STORAGE INTEGRATION command:

    create storage integration s3_int
      type = external_stage
      storage_provider = s3
      enabled = true
      storage_aws_role_arn = 'arn:aws:iam::001234567890:role/myrole'
      storage_allowed_locations = ('s3://unload/files/');
  2. Retrieve the AWS IAM user with the DESCRIBE INTEGRATION command:

    desc integration s3_int;
    +---------------------------+---------------+------------------------------------------------------------------------------+------------------+
    | property                  | property_type | property_value                                                               | property_default |
    +---------------------------+---------------+------------------------------------------------------------------------------+------------------+
    | ENABLED                   | Boolean       | true                                                                         | false            |
    | STORAGE_ALLOWED_LOCATIONS | List          | s3://mybucket1/mypath1/,s3://mybucket2/mypath2/                              | []               |
    | STORAGE_BLOCKED_LOCATIONS | List          | s3://mybucket1/mypath1/sensitivedata/,s3://mybucket2/mypath2/sensitivedata/  | []               |
    | STORAGE_AWS_IAM_USER_ARN  | String        | arn:aws:iam::123456789001:user/abc1-b-self1234                               |                  |
    | STORAGE_AWS_ROLE_ARN      | String        | arn:aws:iam::001234567890:role/myrole                                        |                  |
    | STORAGE_AWS_EXTERNAL_ID   | String        | MYACCOUNT_SFCRole=                                                           |                  |
    +---------------------------+---------------+------------------------------------------------------------------------------+------------------+
  3. Grant the AWS IAM user permissions to access the Amazon S3 bucket, and create an external stage with the CREATE STAGE command:

     create or replace stage my_ext_unload_stage
       url = 's3://unload/files/'
       storage_integration = s3_int
       file_format = my_parquet_unload_format;
  4. Use the COPY INTO <location> command to copy the data from the Snowflake database into the Amazon S3 bucket by specifying the external stage object that you created earlier:

     copy into @my_ext_unload_stage/d1 from mytable;
  5. Transfer the exported files into Cloud Storage by using Storage Transfer Service.

Export Snowflake data to Cloud Storage through other cloud providers

  • Azure Blob Storage: Follow the steps detailed in Unloading into Microsoft Azure. Then, transfer the exported files into Cloud Storage by using Storage Transfer Service.

  • Amazon S3: Follow the steps detailed in Unloading into Amazon S3. Then, transfer the exported files into Cloud Storage by using Storage Transfer Service.

What's next

  • Learn about post-migration performance and optimization.
  • Explore reference architectures, diagrams, and best practices about Google Cloud in the Cloud Architecture Center.