Data Analyst – Market/Risk Analysis and Financial Analysis

ID : 1448
Education level: Master's degree
Work experience level: Expert (more than 7 years)
Work experience in total: 10+ years
Job type: Online
Job time: Monthly
Last date of registration: 2023-01-02
Profile description

Experienced SAS Programmer Analyst and Statistical Modeler with 10+ years of experience in data analysis, including data extraction, manipulation and validation techniques, writing macros and reporting on various projects for Market/Risk Analysis and Financial Analysis, with hands-on experience in statistical modeling.

Work experience in detail

Senior Consultant, Technology - Virtusa, Colombo, Sri Lanka - (April 2022 to Present)

  • Preparing project proposals for SAS migration jobs and mentoring SAS developers.
  • Preparing business proposals and architecture documentation for potential new migration projects.
  • Mentoring junior programmers.
  • Learning Azure and Python machine learning.

 

Senior Software Engineer / SAS Architect - Bureau of Labor Statistics, Department of Labor - (Jan 2020 to Feb 2021)

  • Supported the activities of the Consumer Expenditure Survey division.
  • Assisted in the design, implementation and maintenance of the SAS statistical subsystem business application process for the Consumer Expenditure Survey Data (CEIS) division at BLS (Bureau of Labor Statistics), delivering production data that meets complex business rules for the 2020 Consumer Expenditure Survey.
  • Increased system efficiency of data processing procedures through the creation and use of SAS macro functions on CE (Consumer Expenditure) survey data.
  • Responded to client requests for information, reports and detailed data.
  • Analyzed business requirements and converted them into business rules, detailed flow charts, functional specifications and design documents.
  • Developed SAS macro programs using macro functions to automate the business-as-usual reporting process and improve efficiency (a minimal sketch follows this list).
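
As an illustration of the kind of reporting macro described in the last bullet, here is a minimal sketch; the dataset, variable and path names (ce_survey, expense_category, amount, /reports) are hypothetical and not taken from the actual BLS system.

    /* Minimal sketch of a reusable business-as-usual reporting macro.
       Dataset, variable and output-path names are hypothetical. */
    %macro ce_report(indata=, class=, var=, outdir=/reports);
       ods rtf file="&outdir./&indata._summary.rtf";
       title "Summary of &var. by &class. (run &sysdate9.)";
       proc means data=&indata. n mean sum maxdec=2;
          class &class.;
          var &var.;
       run;
       ods rtf close;
    %mend ce_report;

    /* One call per recurring report */
    %ce_report(indata=ce_survey, class=expense_category, var=amount);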

Lead SAS Modeler / Quantitative Analyst - Key Bank, Cleveland, OH - (September 2018 – December 2019)

  • Supported a team of quantitative analysts on mortgage as a SAS resource in meeting CCAR stress-testing internal deadlines, and built a data mart for the Anti-Money Laundering (AML) operations team.
  • Standardized the macros and programs used to extract and predict Payoff rate, Default rate, Loss Rate and Cumulative loss rate using accounting and economic variables.
  • Developed macros to do sensitivity analysis- stress test by changing economic variables such as unemployment rate and housing price index.
  • Developed programs and predicted key performance indicators such as Payoff rate, Default rate, Loss rate and Cumulative loss rate under Base, Adverse and Severely adverse conditions.
  • Analyzed and compared actual logit values and predicted logit values against Performance Date, Performance Age and Month Since Snapshot.
  • Created bivariate plots for comparing actual as well as model logit values with credit score variables and Loan to value variables.
  • Reviewed, modified and updated the SQL queries used for OBIEE and Cognos business intelligence reports of AML operations, which read data from DB2.
  • Created a data mart in Teradata through SAS, created macros to access data from the Hadoop environment, and wrote SQL queries in Hive to expedite SAS processes for AML operations.
  • Predicted productivity values of AML analysts using PROC TIMESERIES and PROC AUTOREG, fitting the best autoregressive model individually with lag-7 and lag-14 terms (the performance data is strongly weekly-seasonal), via a single reusable macro applied across portfolios without generating non-convergence errors (a minimal sketch follows this list).
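
The autoregressive fitting described in the last bullet might look roughly like the sketch below; the dataset and variable names (aml_prod, analyst_group, perf_date, productivity) are hypothetical, not the actual Key Bank data.

    /* Minimal sketch of a reusable PROC AUTOREG macro with AR terms at lags 7 and 14.
       All dataset and variable names are hypothetical. */
    %macro fit_ar(indata=, group=, depvar=);
       proc sort data=&indata.;
          by &group. perf_date;
       run;

       proc autoreg data=&indata.;
          by &group.;                      /* one model per portfolio/group */
          model &depvar. = / nlag=(7 14)   /* lag-7 and lag-14 AR terms for the weekly pattern */
                             method=ml maxiter=200;
          output out=pred_&depvar. p=predicted;
       run;
    %mend fit_ar;

    %fit_ar(indata=aml_prod, group=analyst_group, depvar=productivity);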

Assistant Manager, Senior Business Analytics (SAS) - Johnson and Johnson Vision, Jacksonville, FL -  (January 2018 – July 2018)

  • Responsible for calculating reassigned sales and entering them into SAP CRM.
  • The reassigned sales calculation is done by collecting data from retail, internet and lease-franchise channels and matching it with the sales reported by warehouses, dealers and distributors.
  • Analyzed and updated retail, online and franchise channel partners' sales data in SAP CRM periodically, depending on the reporting frequency, using SAS.
  • Improved and updated the SAS programs used in the process.

 

SAS ETL Architect - Bunge LTD, White Plains, NY - (Jan 2017 – Jan 2018)

  • Maintained SAS Risk Dimensions and supported the Risk Analysis team with the daily ETL process in production. Maintained the Model and Development environments with the help of the designated admin, using TortoiseSVN version control. The project originally intended to clean and update the existing ETL program, which involved Oracle and Teradata; I achieved the goals on time, learned SAS Risk Dimensions and took on additional responsibilities until the offshore maintenance and support team was ready.
  • Interacted with the Business Users to gather Requirements and analyzed the mutual dependencies of their needs by coordinating with the Project Manager.
  • Documented the legacy ETL system in data flow diagrams using MS Visio. Held a separate brainstorming session with the business and developed the data flow model the business ideally expected.
  • Analyzed the differences and held brainstorming and Joint Application Development sessions to identify and understand the limitations; with the help of the SAS architect, found the best solution and implemented it.
  • Updated the Standard Operating Procedures for preparing Business Requirements Documents and Functional Requirement Documents for registering new prices, registering new products and changing prices for the existing products in the control tables.
  • Improved the legacy ETL process, which contained more than 20k lines of SAS code and used hash datasets and hundreds of macros extensively, by removing redundant steps (a hash-lookup sketch follows this list). Updated and maintained the Development and Model Oracle databases. Reviewed and analyzed requirements monthly to improve the existing programs and procedures.
  • Automated daily reports of positions and VaR, and made sure the daily VaR (Value at Risk) and FX VaR reports were generated and ready for review.
  • Developed new programs for new requirements, such as contribution VaR and attribution, in the Development environment, checked them in Model and moved them to Production using TortoiseSVN.
  • In addition, volunteered to perform PFE (Potential Future Exposure) calculations in MATLAB, quickly learning MATLAB to handle the calculation independently.
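
A hash lookup of the kind used throughout the ETL (see the first bullet above) might look like this minimal sketch; the table and variable names (prices, positions, product_id, price) are hypothetical.

    /* Minimal sketch of a hash-object lookup against a control table.
       All table and variable names are hypothetical. */
    data positions_priced;
       if _n_ = 1 then do;
          declare hash h(dataset: 'work.prices');   /* load the price control table once */
          h.defineKey('product_id');
          h.defineData('price');
          h.defineDone();
          call missing(price);
       end;
       set work.positions;
       /* pull the registered price; flag positions with no matching product */
       if h.find() ne 0 then price_missing = 1;
    run;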

SAS (Visual Analytics) Developer / Data Scientist - Schlumberger, Houston, TX - (January 2016 – January 2017)

  • Worked in both the business intelligence and data science teams. The data science project was to identify, from a large number of candidate variables, the variables that caused tool failures; I successfully narrowed thousands of variables down to the relevant ones. Instead of clustering, I used macros extensively to plot the data and look for patterns near failures. I was then asked to develop dashboards in SAS VA, and built a dashboard whose inferred KPIs can be compared at multiple levels alongside global values.
  • Worked with the data science team on predicting tool failures from big data sets (billions of observations) by leveraging machine learning algorithms such as k-means clustering, hierarchical clustering, variable clustering and decision trees, with PROC FASTCLUS, PROC CLUSTER, PROC VARCLUS, PROC HPLOGISTIC and PROC HPSPLIT (SAS high-performance procedures); a decision-tree sketch follows this list.
  • Responsible for reporting and analytics using SAS and BI tools.
  • Profiled client data in SAS Enterprise Guide in a UNIX environment.
  • Plotted data over a broken/customized time axis, removing the large swaths of inactivity in between, using PROC TEMPLATE and PROC SGRENDER.
  • Plotted histograms of explanatory variables along with average predicted and average actual values over the bins, using PROC UNIVARIATE, PROC TEMPLATE, PROC SGRENDER and PROC SQL.
  • Created macros to select variables, build decision trees and score data.
  • Created the Visual Analytics dashboard and updated it regularly.
  • Coded, tested, debugged, documented and maintained SAS programs.
  • Extracted data using SAS SQL procedures and wrote code to manipulate, aggregate and merge datasets.
  • Designed reporting base tables and developed data visualizations using SAS VA.
  • Developed complex programming models to extract data and manipulated databases to support verification strategies for reporting/analysis.
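
A decision tree of the kind mentioned above (PROC HPSPLIT) might be sketched as follows; the datasets and variables (tool_runs, failure_flag, temp, pressure, vibration, run_hours) are hypothetical, not the actual Schlumberger data.

    /* Minimal sketch of a tool-failure decision tree with PROC HPSPLIT.
       All dataset and variable names are hypothetical. */
    proc hpsplit data=tool_runs seed=12345 maxdepth=6;
       class failure_flag;
       model failure_flag (event='1') = temp pressure vibration run_hours;
       grow entropy;
       prune costcomplexity;
       code file='score_tree.sas';   /* writes DATA step scoring code */
    run;

    /* Score fresh sensor data with the generated code */
    data scored;
       set new_tool_runs;
       %include 'score_tree.sas';
    run;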

Statistical Programmer - Puma Biotechnology, South San Francisco, CA - (February 2015 – September 2015)

  • Used PROC FREQ and PROC SQL queries to study datasets. Merged and concatenated datasets using DATA steps and PROC SQL as needed; transformed wide-form datasets to long form and vice versa using PROC TRANSPOSE; created macro variables using PROC SQL and CALL SYMPUT in a DATA step (a minimal sketch follows this list).
  • Cleaned legacy study data, removed unnecessary or redundant variables and transformed the datasets to XPORT format.
  • Created ADaM and SDTM datasets from specifications, and created specifications for ADaM datasets by referring to the protocol and Statistical Analysis Plan.
  • Used PROC IMPORT and PROC EXPORT to import Excel and CSV files; used DDE to import password-protected files.
  • Created listings using PROC REPORT and RTF programming from ADaM, SDTM and raw datasets, depending on feasibility.
  • Created and reviewed define documents using macros; debugged macros using MPRINT, MLOGIC and SYMBOLGEN; created SAS transport files and patient profiles using PROC COPY. Documented code associations for Adverse Events with MedDRA (noting the relevant version used) and for Concomitant Medications with the WHODrug dictionary.
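
The transpose and macro-variable steps mentioned in the first bullet of this list could look roughly like this sketch; the dataset and variable names (vitals_wide, usubjid, visit1-visit3) are hypothetical, not from an actual study.

    /* Minimal sketch: wide-to-long transposition and macro-variable creation.
       All dataset and variable names are hypothetical. */
    proc transpose data=vitals_wide
                   out=vitals_long(rename=(_name_=visit col1=value));
       by usubjid;             /* one block of output rows per subject */
       var visit1-visit3;
    run;

    /* Macro variable via PROC SQL ... */
    proc sql noprint;
       select count(distinct usubjid) into :n_subj trimmed
       from vitals_long;
    quit;

    /* ... or via CALL SYMPUTX (the trimmed variant of CALL SYMPUT) in a DATA step */
    data _null_;
       if 0 then set vitals_long nobs=nobs;
       call symputx('n_rows', nobs);
       stop;
    run;

    %put NOTE: &n_subj subjects, &n_rows rows after transpose.;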

Software Engineer - USAA, San Antonio, TX - (July 2014 – December 2014)

  • Extracted required data in the needed form and created Excel pivot tables. Investigated patterns and correlations in the progress and transition of customer choices, using the DATA step, PROC SQL, PROC UNIVARIATE, PROC TRANSPOSE, PROC EXPORT, PROC IMPORT, PROC CORR, PROC LOGISTIC and PROC PRINCOMP.

 

Teaching Assistant (Instructor - Elementary Statistics I & II) - University of Cincinnati, Cincinnati, OH - (September 2011 to May 2014)

Made sure the students not only understood the material but also excelled in exams and applications; I set homework and assessments that guided them to success. As a result, 30% of the class passed with an 'A'.
Statistical packages used: SAS, R.
Textbook: "Introduction to the Practice of Statistics", 7th Edition, by David S. Moore, George P. McCabe & Bruce Craig, Chapters 1-13.
Topics covered:

  • Descriptive statistics using PROC UNIVARIATE, PROC CORR, PROC MEANS, PROC FREQ and PROC BOXPLOT.
  • Introduction to inferential statistics and hypothesis testing for means and proportions, using PROC TTEST and PROC SURVEYMEANS.
  • Introduction to ANOVA using PROC ANOVA.
  • Linear regression using PROC REG (a minimal example follows this list).
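
For the linear-regression topic in the last bullet, a minimal classroom-style example (using the sashelp.class dataset that ships with SAS, so it runs as-is) would be:

    /* Simple linear regression: predict weight (lbs) from height (in) */
    proc reg data=sashelp.class;
       model weight = height;
    run;
    quit;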

Prepared the syllabus, conducted evaluations and assigned grades independently.

Hard skills
  • SAS Tools
    • Base SAS
    • SAS/SQL
    • SAS/MACROS
    • SAS/STAT
    • SAS/GRAPH
    • SAS Risk Engine
    • SAS/ODS
    • SAS/ACCESS
    • SAS ETL
    • SAS Enterprise Guide
  • Programming Languages
    • Python
  • OS
    • Windows
    • Linux
    • Unix
  • Databases
    • Oracle
    • DB2
    • Teradata
    • MySQL
    • SQL Server
  • Analytics
    • SAS
    • Python
    • Excel
  • BI tools
    • SAS Visual Analytics
    • Tableau
    • Power BI
  • Cloud Platforms
    • AWS
    • AZURE
    • GCP
Soft skills
Achievements
Special notes

SAS Tools: Base SAS, SAS/SQL, SAS/MACROS, SAS/STAT, SAS/GRAPH, SAS Risk Engine, SAS/ODS, SAS/ACCESS, SAS ETL, SAS Enterprise Guide | Programming: Python | OS: Windows, Linux, Unix | Databases: Oracle, DB2, Teradata, MySQL, SQL Server | Analytics: SAS, Python, Excel | BI tools: SAS Visual Analytics, Tableau, Power BI | Cloud Platforms: AWS, Azure, GCP
