
Module 3 - Technology

CONTENTS

3.1 TECHNOLOGY BASICS AND TERMINOLOGY
3.1.1 Hardware
3.1.2 Operating systems
3.1.3 Applications
3.1.4 Software/application development
3.1.5 General computing technology
3.1.6 Connectivity
3.1.7 Stability
3.2 DATAFEEDS
3.3 Data Distribution – systems and software
3.4 DESKTOPS (e.g. TERMINALS)
3.4.1 Workstations
3.4.2 Instant messaging
3.4.3 Transaction products
3.5 APPLICATIONS AND ASSOCIATED TECHNOLOGY
3.5.1 Application types
3.5.2 Algorithmic trading
3.5.3 Price contributions
3.6 DEVELOPMENTS, IMPLEMENTATION, MANAGEMENT AND SUPPORT
3.6.1 System development
3.6.2 Implementation
3.6.3 Support
3.7 ENTERPRISE AND REFERENCE DATA MANAGEMENT
3.7.1 Basics of back office data distribution systems
3.7.2 Types of information files
3.8 THE CLOUD
3.9 ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING
3.1

TECHNOLOGY BASICS AND TERMINOLOGY

KEY CONCEPTS

What is the primary function of market data systems provided by companies like Refinitiv and Bloomberg?

Market data systems take data input, process it through a feed handler, and distribute it effectively and efficiently to users.

Name two major providers of market data systems and their product offerings.

Refinitiv (formerly Thomson Reuters/Reuters) provides TREP (Thomson Reuters Enterprise Platform) and legacy products like RFDS. Bloomberg provides B-PIPE and the Bloomberg terminal platform.

What was RFDS in the context of Reuters market data systems?

RFDS was the Reuters market data system, created after Reuters merged with TIBCO.

What is TREP and which company offers it?

TREP (Thomson Reuters Enterprise Platform) is one of the latest iterations of market data systems offered by Refinitiv.

3.1.1

Hardware

EXAM OBJECTIVES

  • Processor
  • Memory
  • Storage (disk, tape, USB/memory stick, SAN)
  • NIC (network interface card)
  • KVM (keyboard, video and mouse) switches
  • FPGA – field programmable gate array
  • Telecoms turret

KEY CONCEPTS

What is a Network Interface Card (NIC) and what is its primary function?

A Network Interface Card (NIC) is a hardware component that enables a computer to connect to and communicate with a broader network. It is the primary means by which a computer interfaces with network infrastructure such as Ethernet.

Name the three components that make up a KVM connection.

The three components of a KVM connection are: Keyboard (K), Video (V), and Mouse (M). These connections enable human interaction with a computer system.

What is the most common type of network connection used in modern computing environments?

Ethernet is the most common network connection type used in modern computing environments for connecting computers to networks.

Describe the typical connectivity pattern in modern market data scenarios involving personal computers and servers.

In modern market data scenarios, a personal computer connects to a network (typically Ethernet), which in turn connects to a series of servers that perform functions such as connecting to external data sources or performing complex calculations.

What is an FPGA and what does the acronym stand for?

FPGA stands for Field Programmable Gate Array. It is a hardware component that can be programmed and reconfigured to perform specific computational tasks.

How does a telecoms turret function as a hardware component in financial trading environments?

A telecoms turret is a specialized hardware device used in financial trading environments that integrates multiple communication channels (voice, data, etc.) to enable traders to communicate with clients, brokers, and other market participants simultaneously.

3.1.2

Operating systems

EXAM OBJECTIVES

  • Windows (various)
  • Unix, Linux
  • Mac (Apple)

KEY CONCEPTS

Name the three major categories of operating systems covered in FIA certification syllabus section 3.1.2

Windows (various versions), Unix/Linux, and Mac (Apple)

Which of these operating systems is a Unix-based system: Windows, Linux, or Mac?

Both Linux and Mac are Unix-based systems. Windows is not Unix-based.

What are the key differences between Windows and Linux operating systems in terms of architecture?

Windows is proprietary and closed-source with a GUI-focused design. Linux is open-source, Unix-based, and supports both GUI and command-line interfaces with greater flexibility for server environments.

Mac operating systems are built on which Unix variant?

Mac operating systems (macOS) are built on BSD (Berkeley Software Distribution), a Unix variant, combined with Apple's proprietary technologies.

3.1.3

Applications

EXAM OBJECTIVES

  • Spreadsheet (incl. MS Excel, Functions, DDE, RTD, Macros, VBA)
  • Databases (Incl. SQL and Database Queries)
  • Custom applications (in-house, vendor)
  • Browser, messaging, email

KEY CONCEPTS

What are the primary functions of market data systems like those provided by Refinitiv and Bloomberg?

Market data systems ingest data from various sources, process it through a feed handler, and distribute it efficiently and effectively to end users and applications.

Name two major providers of market data platforms and give an example of their product offerings.

Refinitiv (formerly Thomson Reuters/Reuters) provides TREP (Thomson Reuters Enterprise Platform), and Bloomberg provides B-PIPE and the Bloomberg terminal platform for data distribution.

How do spreadsheet applications like MS Excel integrate with real-time data feeds in financial systems?

Spreadsheets can receive live market data through DDE (Dynamic Data Exchange), RTD (Real-Time Data) functions, or custom VBA macros that connect to external data sources and market data providers.

What role do databases and SQL queries play in financial applications?

Databases store historical and real-time financial data, and SQL queries enable efficient retrieval, filtering, and analysis of large datasets for reporting and decision-making purposes.
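
As a rough illustration of that retrieval role, the sketch below uses Python's built-in sqlite3 module; the table name, columns and sample values are purely hypothetical.

```python
import sqlite3

# Connect to a local SQLite database (hypothetical file name).
conn = sqlite3.connect("market_data.db")
cur = conn.cursor()

# Hypothetical table of end-of-day prices.
cur.execute("""
    CREATE TABLE IF NOT EXISTS eod_prices (
        symbol TEXT,
        trade_date TEXT,
        close REAL
    )
""")
cur.executemany(
    "INSERT INTO eod_prices VALUES (?, ?, ?)",
    [("VOD.L", "2024-01-02", 68.4), ("VOD.L", "2024-01-03", 69.1)],
)

# A typical query: filter by instrument and aggregate for reporting.
cur.execute(
    "SELECT symbol, MIN(close), MAX(close), AVG(close) "
    "FROM eod_prices WHERE symbol = ? GROUP BY symbol",
    ("VOD.L",),
)
print(cur.fetchall())
conn.close()
```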

What is the primary difference between vendor-provided market data systems and custom in-house applications?

Vendor systems (Bloomberg, Refinitiv) are pre-built, standardized platforms with established infrastructure; custom in-house applications are developed internally to meet specific organizational requirements and integrate with existing systems.

How do browsers, messaging, and email applications interface with financial data distribution systems?

These applications receive and display market data alerts, notifications, and reports from data feeds; they enable real-time communication of critical market information to end users across the organization.

3.1.4

Software/application development

EXAM OBJECTIVES

  • What is software/firmware
  • Programming Languages (C, C#, .NET, Java, Perl, Python)
  • API (Application Programming Interface)
  • FPGA and hardware acceleration
  • Hadoop (Big Data)

KEY CONCEPTS

Define software and firmware in the context of application development.

Software refers to applications and programs that run on systems, typically developed using programming languages. Firmware is low-level software that controls hardware devices and is often written in C or C++, providing the interface between hardware and higher-level applications.

What is the primary difference between traditional languages like C/C++ and modern languages like Python in application development?

C/C++ are older, compiled languages known for high performance and system-level access, commonly used in legacy and embedded systems. Python is a modern, interpreted language favored for rapid development, data analysis, and machine learning due to its simplicity and extensive libraries.

List three programming languages commonly used in modern application development and identify their primary use cases.

Python: data analysis, machine learning, scripting; Java: enterprise applications, cross-platform development; C#/.NET: Windows-based enterprise applications. Other examples include C/C++ (system-level, high-performance), and Perl (scripting, text processing).

What is an Application Programming Interface (API) and what purpose does it serve in software development?

An API is a set of rules and tools that allows software developers to connect with other software applications or access data. It defines the methods and data formats that applications can use to communicate and interact with each other or external systems.

Explain how APIs enable integration between different software systems.

APIs provide standardized interfaces that allow different applications to request and exchange data or functionality. They establish protocols and rules that developers use to build connections, enabling software to interface with other applications or data sources without needing direct access to underlying code.
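
A minimal sketch of calling such an API over HTTP with only Python's standard library; the endpoint URL, token and response fields are invented for illustration and do not belong to any particular vendor.

```python
import json
import urllib.request

# Hypothetical REST endpoint exposed by a market data provider.
URL = "https://api.example.com/v1/quotes?symbol=VOD.L"

request = urllib.request.Request(URL, headers={"Authorization": "Bearer <token>"})
try:
    with urllib.request.urlopen(request, timeout=5) as response:
        quote = json.loads(response.read())  # e.g. {"symbol": "VOD.L", "bid": 68.4, "ask": 68.5}
        print(quote.get("bid"), quote.get("ask"))
except OSError as exc:
    # The placeholder host does not exist, so a real run would land here.
    print("request failed:", exc)
```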

What is an FPGA and what advantage does it provide in application development and hardware acceleration?

An FPGA (Field-Programmable Gate Array) is reconfigurable hardware that can be programmed to perform specific computational tasks. It provides hardware acceleration by executing algorithms directly in hardware, offering significant performance improvements for computationally intensive applications compared to software-based solutions.

3.1.5

General computing technology

EXAM OBJECTIVES

  • The Internet (the world wide web)
  • Distributed computing/systems
  • Grid computing
  • Cloud computing (the cloud)
  • Big data
  • Artificial intelligence (AI)
  • Machine learning (ML)
  • Quantum computing
  • Natural language processing
  • Deployed/hosted
  • Blockchain - distributed ledger
  • Cyber security

KEY CONCEPTS

What is the key distinction between the Internet and the World Wide Web?

The Internet is the global connectivity infrastructure of computers worldwide that enables communication. The World Wide Web is software designed to locate relevant servers and websites and enable access to them through browsers.

Define the Internet in the context of general computing technology.

The Internet is the worldwide network of interconnected computers that provides the infrastructure for data transmission and communication across geographically dispersed locations.

What is the fundamental purpose of the World Wide Web?

The World Wide Web is software infrastructure designed to enable users to find relevant servers and websites and access them through web browsers.

How does the World Wide Web relate to the Internet?

The World Wide Web operates on top of the Internet infrastructure; it uses the Internet's connectivity to deliver web-based services and enable access to websites.

Define distributed computing and distributed systems.

Distributed computing involves multiple computers or servers connected together to work collaboratively, sharing processing tasks and resources across different locations within an organization or network.

What is the primary contrast between personal computing and distributed systems?

Personal computing operates independently with minimal external connections, while distributed systems integrate multiple servers and services across an organization to work collaboratively and share resources.

3.1.6

Connectivity

EXAM OBJECTIVES

  • Networks
  • TCP/IP
  • Point to point
  • Multicast
  • Broadcast
  • Transmission medium (copper, fiber optics, satellite, microwave)
  • Network/communications hardware (hubs, switches, routers, modems, firewalls)
  • Telco providers
  • Leased lines
  • VPN
  • Internet
  • Extranet
  • Middleware
  • Bandwidth, throughput
  • Hosting, data center
  • Co-Location
  • Managed service
  • Proximity

KEY CONCEPTS

What is the primary difference between TCP/IP and other network protocols?

TCP/IP is a layered protocol suite that provides reliable, connection-oriented (TCP) and connectionless (UDP) communication, forming the foundation of modern internet communication.

Name the four layers of the TCP/IP model.

Application layer, Transport layer, Internet layer, and Link layer (or Network Interface layer).

In point-to-point communication, what is the primary characteristic of the connection?

A direct link between two devices where data is transmitted from one sender to one receiver without intermediate nodes.

What is the key difference between point-to-point and multicast transmission?

Point-to-point is one-to-one communication, while multicast sends data from one source to multiple specific recipients in a group.

Define broadcast transmission and provide an example.

Broadcast sends data from one sender to all devices on a network segment. Example: ARP (Address Resolution Protocol) requests sent to all devices on a local network.

What are the three main types of network transmission methods?

Unicast (one-to-one), multicast (one-to-many specific recipients), and broadcast (one-to-all on a network segment).
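
At the socket level, subscribing to a multicast group looks roughly like the Python sketch below; the group address and port are hypothetical, and real exchange feeds publish their own values.

```python
import socket
import struct

GROUP, PORT = "239.1.1.1", 5000  # hypothetical multicast group and port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# Join the multicast group: one sender, many subscribed receivers.
mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

while True:
    data, sender = sock.recvfrom(65535)  # blocks until a datagram arrives
    print(f"{sender}: {data!r}")
```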

3.1.7

Stability

EXAM OBJECTIVES

  • Resilience and fail-over
  • Latency
  • Scalability
  • Message rates

KEY CONCEPTS

What is the primary difference between resilience and failover in a trading system?

Resilience is the broader concept of maintaining maximum uptime by quickly recovering when any part of the system fails. Failover is a specific subset of resilience where processes automatically switch from a failed machine to a mirroring machine.

In the context of failover, what does 'hot standby' mean?

Hot standby refers to a standby system that is actively running and ready to immediately take over operations if the primary system fails, providing seamless failover.

Why is resilience and failover critical in financial markets?

Financial markets require as close to 100% uptime as possible. Resilience and failover ensure that if connectivity is lost, an application fails, or hardware malfunctions, the system automatically recovers and continues operating with minimal interruption.

What is the distinction between hot standby and cold standby in failover scenarios?

Hot standby is a mirroring system that is already running and can take over immediately upon primary system failure. Cold standby is an inactive system that must be started and initialized when failover occurs, resulting in slower recovery time.
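
A minimal sketch of client-side failover logic, assuming hypothetical primary and standby host names: because a hot standby is already running, the second connection attempt succeeds immediately, whereas a cold standby would first have to be started.

```python
import socket

# Hypothetical primary and hot-standby feed servers.
SERVERS = [("feed-primary.example.com", 9100), ("feed-standby.example.com", 9100)]

def connect_with_failover(servers, timeout=2.0):
    """Try the primary first; fail over to the standby if it is unreachable."""
    for host, port in servers:
        try:
            sock = socket.create_connection((host, port), timeout=timeout)
            print(f"connected to {host}:{port}")
            return sock
        except OSError:
            print(f"{host}:{port} unavailable, failing over")
    raise ConnectionError("no primary or standby server reachable")

# conn = connect_with_failover(SERVERS)
```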

How does latency affect the effectiveness of failover mechanisms?

Lower latency in failover detection and activation ensures quicker recovery time, minimizing system downtime and data loss during the transition from primary to standby systems.

What role does scalability play in maintaining stability during high message rates?

Scalability ensures the system can handle increased message rates without degradation. A scalable system can distribute load across multiple resources, preventing bottlenecks and maintaining stability under peak trading volumes.

3.2

DATAFEEDS

EXAM OBJECTIVES

  • How a datafeed is typically delivered to a client
  • What a “consolidated” datafeed is
  • What a “direct” datafeed is
  • How datafeeds are processed
  • The following terminologies as they relate to data feeds:
  • Throttling, pulsed, intervalized
  • Snapped, streaming, delayed, real-time, EOD, conflated
  • Push/pull technologies
  • Pub/Sub mechanisms
  • Protocols (FIX, FAST, ITCH, XML)
  • Aggregated/consolidated
  • Direct
  • Hybrid or co-located or hosted solutions

KEY CONCEPTS

What is the primary source of data feeds in the market data industry?

Large data vendors or exchanges that provide a large flow of market data through dedicated pipes to clients

How is a typical datafeed delivered to a client?

Through large pipes/flows from data vendors or exchanges, delivered over networks to the client's local area network or PC

What is a consolidated datafeed?

A datafeed that aggregates data from multiple sources or exchanges into a single unified feed for the client

What is a direct datafeed?

A datafeed received directly from a single exchange or data vendor without aggregation from multiple sources

Define throttling in the context of datafeeds.

A technique to limit the rate at which data is sent to control bandwidth and processing load

What does 'pulsed' mean when describing datafeed delivery?

Data delivered in periodic bursts or pulses rather than in a continuous stream
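
The toy Python generator below illustrates throttling and conflation together: incoming ticks are collected, only the latest value per symbol is retained, and one snapshot is published per interval. It is a sketch of the idea, not any vendor's implementation.

```python
import time

def conflate(tick_stream, interval=1.0):
    """Yield at most one snapshot per interval, keeping only the latest tick per symbol."""
    latest = {}
    next_publish = time.monotonic() + interval
    for symbol, price in tick_stream:
        latest[symbol] = price                 # newer ticks overwrite older ones
        if time.monotonic() >= next_publish:
            yield dict(latest)                 # one conflated snapshot
            latest.clear()
            next_publish = time.monotonic() + interval

# for snapshot in conflate(some_tick_source()):   # some_tick_source() yields (symbol, price)
#     publish_to_users(snapshot)                  # hypothetical downstream publish step
```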

3.3

Data Distribution – systems and software

EXAM OBJECTIVES

  • What is a market data distribution system (MDDS)?
  • What application permissioning and user entitlements are
  • What is broadly understood by the phrase EDM system?

KEY CONCEPTS

What is a market data distribution system (MDDS) and what is its primary purpose?

An MDDS is a system designed to enable a firm to distribute market data that has been ingested from external data feeds (consolidated vendors, exchanges, or internally generated) to multiple users and applications in real-time.

What are the three sources from which market data can be ingested into an MDDS?

Market data can come from: (1) consolidated data vendors, (2) direct feeds from exchanges, and (3) internal data generated by the firm itself.

What types of recipients can receive distributed market data in an MDDS?

Recipients include: (1) human users at workstations or terminals, (2) central applications providing functionality, and (3) local applications on personal computers.

What is application permissioning and user entitlements in the context of data distribution systems?

Application permissioning and user entitlements refer to the controls and access rights that determine which users and applications are authorized to receive and access specific market data within a distribution system.

What is an EDM system in broadly understood financial technology terms?

An EDM (Enterprise Data Management) system is a comprehensive system responsible for managing data flows, including ingestion from external feeds, processing, and distribution of market data across an organization to authorized users and applications.

What is the role of a feed handler in a market data distribution system?

A feed handler is software that connects to a data feed, interfaces with it, and handles the processing and management of incoming market data from external sources.
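
A highly simplified sketch of these ideas in Python: a Publisher class stands in for the distribution layer, a feed handler would call publish(), and subscribe() is refused unless the user is entitled to the data source. The class, user names and entitlement model are invented for illustration.

```python
class Publisher:
    """Toy distribution core: entitlement-checked subscriptions, fan-out on publish."""

    def __init__(self):
        self.subscribers = {}                                   # symbol -> list of callbacks
        self.entitlements = {"alice": {"LSE"}, "bob": {"LSE", "NYSE"}}

    def subscribe(self, user, symbol, source, callback):
        if source not in self.entitlements.get(user, set()):
            raise PermissionError(f"{user} is not entitled to {source} data")
        self.subscribers.setdefault(symbol, []).append(callback)

    def publish(self, symbol, update):
        for callback in self.subscribers.get(symbol, []):       # fan out to every subscriber
            callback(symbol, update)

pub = Publisher()
pub.subscribe("bob", "IBM", "NYSE", lambda s, u: print("bob received", s, u))
pub.publish("IBM", {"bid": 168.2, "ask": 168.3})                # a feed handler would call this
```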

3.4

DESKTOPS (e.g. TERMINALS)

KEY CONCEPTS

Name two major providers of market data terminal systems in the financial industry.

Refinitiv (formerly Thomson Reuters) and Bloomberg are the two largest providers of market data terminal systems.

What is the current flagship enterprise platform offering from Refinitiv?

TREP (Thomson Reuters Enterprise Platform) is one of the latest iterations of Refinitiv's market data platform offerings.

What historical market data system did Reuters develop that later became known as the Reuters market head system?

Reuters developed systems including Triarch and, after merging with TIBCO, created RFDS (the Reuters market data system).

What is the fundamental function performed by all market data terminal systems regardless of provider?

All market data terminals take data input through a feed handler, then distribute it effectively and efficiently to end users.

Name one of Bloomberg's product lines in the market data space.

B-PIPE is one of Bloomberg's product lines for data feeds, and the Bloomberg terminal platform is another offering in this space.

Are you required to memorize specific desktop terminal product offerings for the FIA exam?

No, you will not be tested on specific product offerings. Understanding the fundamental functions and general providers is sufficient.

3.4.1

Workstations

EXAM OBJECTIVES

  • The basics of desktop workstations
  • The role of Microsoft Windows: DDE, OLE and RTD;
  • How market data applications connect and communicate with the server;
  • The difference between fat and thin client technology.
  • The key workstation offerings in the market.
  • The role of profiling
  • High versus mid tier solutions
  • Understand the main vendors and their different offerings
  • Have a basic understanding of mobile and handheld devices and how data can be distributed and displayed to such devices.

KEY CONCEPTS

What are the two main types of client architectures used in market data workstations?

Fat client and thin client. Fat client has significant processing power and software installed locally on the PC, while thin client relies more on central servers for processing and data management.

Name three Microsoft technologies used for workstation data communication and integration.

DDE (Dynamic Data Exchange), OLE (Object Linking and Embedding), and RTD (Real-Time Data). These enable data flow and integration between applications on the workstation.

What is the critical role of network connectivity in a market data workstation?

Network connectivity is vital for interfacing between the workstation and central systems, central services, and data providers. It enables the reception and transmission of market data feeds from multiple sources.

Describe how market data applications communicate with servers in a workstation environment.

Market data applications use network connectivity and embedded software (invisible to users) to handle data traffic and flow. The software receives information from data feeds and interfaces with central services and data providers.

What is the primary function of software running on a market data workstation that users do not directly interact with?

It handles all data traffic and data flow between the workstation and external data feeds/central systems, managing the network interface and connectivity aspects of the system.

What is the most important visible capability of workstation software in a market data system?

Display of market data information. Various software programs from different providers compete to present the received information clearly and effectively to users.

3.4.2

Instant messaging

KEY CONCEPTS

What is the primary relationship between instant messaging and market data provision?

Instant messaging or chat is closely associated with market data access and is often provided alongside market data offerings by major market data providers, though it is not strictly a market data concept itself.

Which department typically manages instant messaging in a financial institution?

The market data team in an organization typically controls and manages instant messaging or chat functionality, even though it is not strictly a market data function.

Why is instant messaging contentious from a compliance perspective in financial institutions?

Financial institutions need to track, monitor and record all communications and maintain audit trails. This compliance requirement makes instant messaging management complex, as systems must interface with the bank's system management functions to ensure proper oversight and record-keeping.

What key capability must instant messaging systems have in regulated financial institutions?

Instant messaging systems must have the ability to interface with the institution's system management functions to enable tracking, monitoring, recording, and audit trail creation of all communications for compliance purposes.

3.4.3

Transaction products

KEY CONCEPTS

What are the primary functions of market data systems in transaction products?

Market data systems take data input, process it through a feed handler, and distribute it effectively and efficiently to end users.

Name two major providers of market data systems in the financial industry.

Bloomberg and Refinitiv are the two largest providers of market data systems. Other providers also exist in this space.

What is the evolution of Thomson Reuters' market data platform products?

Reuters' product evolution includes: Reuters (original), Triarch, RFDS (Reuters market data system), and TREP (Thomson Reuters Enterprise Platform), which is one of their latest iterations.

What should a candidate focus on when learning about different market data system providers?

Candidates should understand the fundamental functions all providers perform rather than memorizing specific product offerings, as all systems essentially take in data, process it through feed handlers, and distribute it efficiently.

3.5

APPLICATIONS AND ASSOCIATED TECHNOLOGY

KEY CONCEPTS

Name two major providers of market data systems and their current or former product names.

Refinitiv (formerly Thomson Reuters, formerly Reuters) with products including TREP (Thomson Reuters Enterprise Platform), RFDS, and Triarch. Bloomberg with products including B-PIPE and the Bloomberg terminal platform.

What is the fundamental function that all market data system providers perform, regardless of their specific product offerings?

All market data system providers take data into a feed handler, then distribute it effectively and efficiently to end users.

What major merger created the Reuters product called RFDS?

The merger of Reuters with TIBCO created the product RFDS (Reuters market data system).

Why is it not necessary to memorize specific individual market data system product offerings for this exam?

Because all market data system providers fundamentally perform the same core functions of data intake, feed handling, and efficient distribution, regardless of their specific product names or offerings.

3.5.1

Application types

EXAM OBJECTIVES

  • Charting and technical analysis
  • Various mathematical functions
  • Algorithmic trading
  • Risk management
  • Trading systems
  • OMS & EMS
  • Smart order routing
  • Pricing systems

KEY CONCEPTS

What is the primary purpose of charting and technical analysis applications in a trading environment?

To visualize market data and identify patterns, trends, and price movements that inform trading decisions and help traders recognize potential entry and exit points.

Name three common mathematical functions used in technical analysis for market data interpretation.

Moving averages (simple and exponential), standard deviation for volatility measurement, and Relative Strength Index (RSI) for momentum analysis.
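
For concreteness, simple versions of two of these functions can be written in a few lines of Python; the price series is made up, and this RSI uses plain averages rather than Wilder's smoothing.

```python
def sma(prices, window):
    """Simple moving average of the last `window` closes."""
    return sum(prices[-window:]) / window

def rsi(prices, period=14):
    """Relative Strength Index over the trailing `period` price changes (unsmoothed)."""
    changes = [b - a for a, b in zip(prices[-period - 1:], prices[-period:])]
    gains = sum(c for c in changes if c > 0)
    losses = -sum(c for c in changes if c < 0)
    if losses == 0:
        return 100.0
    return 100 - 100 / (1 + gains / losses)

closes = [100, 101, 102, 101, 103, 104, 103, 105, 106, 105, 107, 108, 107, 109, 110]
print(round(sma(closes, 5), 2), round(rsi(closes), 1))
```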

What role do mathematical functions play in algorithmic trading systems?

Mathematical functions process market data to generate automated trading signals, calculate position sizing, evaluate risk metrics, and execute trades based on predefined quantitative rules.

How do algorithmic trading systems utilize market data to make trading decisions?

They apply mathematical models and algorithms to real-time and historical market data to identify patterns, calculate probabilities, and execute trades automatically without human intervention.

What are the three main components of an effective risk management system in trading applications?

Position sizing controls, stop-loss mechanisms to limit losses, and portfolio-level exposure limits to manage overall risk across multiple positions.

How does risk management data influence order generation in trading systems?

Risk management parameters set maximum position sizes, drawdown limits, and exposure thresholds that constrain order quantity and type, preventing orders that would violate risk policies.
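
A minimal sketch of such a pre-trade check in Python; the limit values, order fields and thresholds are hypothetical.

```python
# Hypothetical risk limits; real systems would load these from a risk policy store.
LIMITS = {"max_order_qty": 10_000, "max_position": 50_000, "max_gross_exposure": 5_000_000}

def check_order(order, position, gross_exposure):
    """Reject an order that would breach position-size or exposure limits."""
    if order["qty"] > LIMITS["max_order_qty"]:
        return False, "order quantity above limit"
    if abs(position + order["qty"]) > LIMITS["max_position"]:
        return False, "resulting position above limit"
    if gross_exposure + order["qty"] * order["price"] > LIMITS["max_gross_exposure"]:
        return False, "gross exposure above limit"
    return True, "accepted"

print(check_order({"qty": 2_000, "price": 50.0}, position=1_000, gross_exposure=100_000))
```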

3.5.2

Algorithmic trading

EXAM OBJECTIVES

  • Low latency feed and distribution systems
  • Messaging systems
  • Complex event processing (CEP) and its relevance
  • The reasons for tick capture systems
  • The reasons for latency metrics and the existing technologies being marketed
  • The benefits and challenges of co-location and proximity hosting

KEY CONCEPTS

What is the primary function of a low latency feed and distribution system in algorithmic trading?

To enable rapid ingestion of market data from external feeds (consolidated vendors, exchanges, or internal sources) and distribute real-time updating data to multiple users and applications with minimal delay.

Name the three to four broad elements that must be considered in market data distribution systems.

Feed handlers (software connecting to data feeds), data ingestion mechanisms, distribution infrastructure, and user/application endpoints (workstations, central applications, or local personal computer applications).

What role do messaging systems play in algorithmic trading environments?

Messaging systems facilitate the transport and delivery of market data and trading signals between distributed components, enabling asynchronous communication and data dissemination across the trading infrastructure.

How do messaging systems differ from direct feed connections in market data distribution?

Messaging systems provide decoupled, asynchronous communication with queuing and routing capabilities, whereas direct feed connections are typically point-to-point synchronous links between a data source and consumer.

What is Complex Event Processing (CEP) and why is it relevant to algorithmic trading?

CEP is the technology for detecting patterns and relationships across multiple market data events in real-time; it is relevant because it enables algorithmic traders to identify trading signals and market conditions by processing streams of market data and triggering automated responses.

In what scenarios is Complex Event Processing (CEP) critical for trading strategies?

CEP is critical when strategies require detection of multi-event patterns, correlation of data across multiple feeds, or real-time aggregation of market conditions that span time windows and multiple data sources.
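
The toy Python generator below illustrates the kind of multi-event, time-windowed pattern a CEP engine detects, here a burst of trades in one symbol; it is a conceptual sketch, not a real CEP product.

```python
from collections import deque

def detect_bursts(events, threshold=3, window=1.0):
    """Flag a symbol when `threshold` events arrive within `window` seconds.

    `events` yields (timestamp, symbol) tuples.
    """
    recent = {}                                   # symbol -> deque of recent timestamps
    for ts, symbol in events:
        q = recent.setdefault(symbol, deque())
        q.append(ts)
        while q and ts - q[0] > window:           # drop events outside the time window
            q.popleft()
        if len(q) >= threshold:
            yield ts, symbol                      # pattern matched: burst of activity

stream = [(0.0, "VOD"), (0.2, "VOD"), (0.3, "BP"), (0.4, "VOD"), (2.0, "VOD")]
print(list(detect_bursts(stream)))                # [(0.4, 'VOD')]
```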

3.5.3

Price contributions

EXAM OBJECTIVES

  • Spreadsheet publishing
  • Multi-vendor contribution systems
  • Vendor contribution protocols

KEY CONCEPTS

What is the primary advantage of using multi-vendor contribution systems compared to sending data to each vendor individually?

Multi-vendor contribution systems allow you to send a single item of data or price once to an internal contribution server, which then automatically distributes that information to multiple data vendors simultaneously, eliminating the need to send data multiple times to each vendor separately.

How does spreadsheet publishing relate to price contribution workflows in multi-vendor environments?

Spreadsheet publishing enables users to consolidate and format price contributions that can be sent through a multi-contribution server to multiple vendors at once, rather than manually publishing to each vendor's system individually.

What is the functional role of a multi-contribution server in a price contribution system?

A multi-contribution server acts as a central hub that receives price data from an internal application and distributes that single submission to multiple data vendors simultaneously, streamlining the contribution process.

Why is understanding vendor contribution protocols essential when implementing a contribution server?

The contribution server running on your site must know and understand the specific protocols and language requirements that each data vendor requires to receive and process price information correctly, similar to how a feed handler must understand data feed protocols.

What is the relationship between a contribution server and vendor protocols, analogous to feed handlers?

Just as a feed handler must understand the protocols and interface language of a data feed to properly receive information, a contribution server must understand the protocols required by each data vendor to properly send and deliver price contribution data.

What problem does a multi-contribution service solve regarding the frequency of vendor data submissions?

A multi-contribution service eliminates the time-consuming and complicated process of submitting the same price data twice or three times to different vendors by enabling a single submission that distributes to all vendors automatically.
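
A minimal sketch of that fan-out in Python: one internal price record is reformatted per vendor and sent once to each. The vendor names, formats and send mechanism are invented; a real contribution server would speak each vendor's actual protocol.

```python
def to_vendor_a(record):
    # Invented pipe-delimited format for "Vendor A".
    return f"ID={record['id']}|BID={record['bid']}|ASK={record['ask']}"

def to_vendor_b(record):
    # Invented dictionary/JSON-style format for "Vendor B".
    return {"ticker": record["id"], "bid": record["bid"], "ask": record["ask"]}

VENDOR_FORMATTERS = {"VendorA": to_vendor_a, "VendorB": to_vendor_b}

def contribute(record, send):
    """Fan one internal price record out to every configured vendor."""
    for vendor, formatter in VENDOR_FORMATTERS.items():
        send(vendor, formatter(record))

contribute({"id": "GBPUSD", "bid": 1.2701, "ask": 1.2703},
           send=lambda vendor, payload: print(vendor, payload))
```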

3.6

DEVELOPMENTS, IMPLEMENTATION, MANAGEMENT AND SUPPORT

KEY CONCEPTS

Name two major providers of market data distribution systems and their product evolution.

Reuters (now Thomson Reuters, now Refinitiv) with products including Triarch, RFDS (Reuters market data system), and TREP (Thomson Reuters Enterprise Platform). Bloomberg with product lines including B-PIPE for data feeds and the Bloomberg terminal platform.

What is the fundamental function that all market data distribution systems perform, regardless of the specific provider?

They take data in through a feed handler, process it, and then distribute it effectively and efficiently to end users.

Describe the evolution of Reuters' market data systems from earliest to most recent iteration.

Reuters products included Triarch (an older iteration), RFDS (Reuters market data system, which followed the merger with TIBCO), and TREP (Thomson Reuters Enterprise Platform), one of the latest iterations.

Why would you NOT be expected to memorize specific market data system product offerings for this exam?

Because all market data distribution systems fundamentally perform the same core function regardless of provider - they ingest data through feed handlers and distribute it effectively and efficiently. The specific product names and offerings are less important than understanding the underlying functionality.

3.6.1

System development

EXAM OBJECTIVES

  • Basic understanding of programming languages and terminology (code, run-time, API, UAT, version control, etc.);
  • Development environments (dev, QA, production);
  • Handling data (data integrity, support of entitlements and usage tracking, implications of derived data and redistribution);
  • Importance of test environments.

KEY CONCEPTS

What is code in the context of software development?

Code is the set of instructions generated by a programmer using a programming language that a computer can understand and execute.

Name three examples of programming languages used in software development.

Examples include C, C++, Perl, Python, and Java. These are languages in which programmers write applications that run on machines.

What does API stand for and what is its purpose?

API stands for Application Programming Interface. It allows different software applications to communicate and interact with each other.

What is an SDK and how does it relate to an API?

SDK stands for Software Development Kit. It is a related concept to an API and provides tools and libraries that developers use to build applications.

What are the three primary development environments and their purposes?

Dev (development) for coding, QA (quality assurance) for testing, and Production for live end-user use. Each environment is separate to ensure stability and quality.
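
One common (though not the only) way to keep the same code while moving between these environments is to select configuration from an environment variable, as in this hypothetical sketch:

```python
import os

# Hypothetical per-environment settings; in practice these often live in
# separate config files or a secrets store rather than in code.
CONFIG = {
    "dev":  {"feed_url": "feed-dev.internal:9100",  "log_level": "DEBUG"},
    "qa":   {"feed_url": "feed-qa.internal:9100",   "log_level": "INFO"},
    "prod": {"feed_url": "feed-prod.internal:9100", "log_level": "WARNING"},
}

env = os.environ.get("APP_ENV", "dev")   # same code base, different environment
settings = CONFIG[env]
print(f"running in {env}: {settings}")
```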

Why are test environments important in software development?

Test environments allow for validation of code changes, identification of bugs, and verification of system functionality before deployment to production, reducing risk and ensuring quality.

3.6.2

Implementation

EXAM OBJECTIVES

  • How systems are tested and deployed
  • The techniques used to package software for mass deployment
  • Software compatibility issues and resolution
  • Importance of change management processes
  • Effective communication and planned changes - including leadtime
  • The role of “project management”
  • The importance of being able to “fall back” to an earlier version
  • The importance of backward compatibility

KEY CONCEPTS

What is the primary purpose of testing software in a development environment before deployment?

To ensure the software works correctly without affecting business operations in the production environment. Testing identifies and resolves issues before the application goes live.

Describe the relationship between development stage testing and runtime environment deployment.

Software progresses through development stages with various testing phases. Once approved after passing all tests, it is deployed to the runtime environment for actual business use.

What does UAT stand for and when does it occur in the deployment process?

UAT stands for User Acceptance Testing. It occurs as one of the testing phases where real human users test the application in a real-world scenario to verify functionality and approve it for use.

What is the critical outcome that must be achieved during user acceptance testing before deployment?

Users must be satisfied and comfortable with the application, formally approving it by signing off that it meets their requirements and is ready for production deployment.

Name two different types of testing that may be performed on software during the testing phase.

1) Functional testing - checking that the program works correctly; 2) Data testing - verifying the program works effectively with actual data.
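
A small illustration of the two ideas using Python's built-in unittest module; the function under test and the test data are invented for the example.

```python
import unittest

def mid_price(bid, ask):
    """Function under test: mid point of a bid/ask quote."""
    if bid is None or ask is None or bid > ask:
        raise ValueError("invalid quote")
    return (bid + ask) / 2

class MidPriceTests(unittest.TestCase):
    def test_functional(self):                      # does the program work correctly?
        self.assertEqual(mid_price(100.0, 102.0), 101.0)

    def test_data(self):                            # does it cope with bad real-world data?
        with self.assertRaises(ValueError):
            mid_price(102.0, 100.0)                 # crossed quote

if __name__ == "__main__":
    unittest.main()
```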

Why is it important to have different testing phases rather than deploying directly to production?

Multiple testing phases help identify and resolve issues before they affect business operations, reducing risk and ensuring software quality and compatibility in the production environment.

3.6.3

Support

EXAM OBJECTIVES

  • Importance of incident and problem management within market data
  • Need for capacity management as it relates to market data
  • Understand client (i.e. end users) base and impact of systems failure (availability management)
  • Impact of systems recovery to end users and applications
  • Business continuity management and disaster recovery planning
  • System monitoring and alerting, including latency

KEY CONCEPTS

What is the primary purpose of incident and problem management in market data systems?

To identify, document, and resolve issues affecting market data delivery to minimize disruption to trading and transaction activities, ensuring data reliability and system performance.

How does problem management differ from incident management in the context of market data?

Incident management addresses immediate issues and their resolution, while problem management focuses on identifying root causes and implementing permanent solutions to prevent recurring incidents in market data systems.

Why is capacity management critical for market data systems?

Capacity management ensures that systems have sufficient resources to handle real-time market data flows without degradation, preventing performance bottlenecks during peak trading periods and maintaining service quality.

What factors must be considered when planning capacity for market data infrastructure?

Network bandwidth, middleware processing power, storage requirements, and expected data volumes must be assessed to ensure the system can sustain real-time market data distribution and prevent latency issues.

How does understanding the client base impact availability management for market data systems?

Understanding end-user dependencies and critical trading workflows allows organizations to prioritize system resilience, redundancy, and failover mechanisms to minimize impact when systems fail.

What is the business impact of market data system unavailability on end users?

System unavailability prevents traders from accessing real-time market information, blocking informed trading decisions and transactions, which can result in missed opportunities, financial losses, and competitive disadvantage.

3.7

ENTERPRISE AND REFERENCE DATA MANAGEMENT

KEY CONCEPTS

What is the primary function of enterprise data management systems in financial markets?

Enterprise data management systems take market data input, process it through feed handlers, and distribute it effectively and efficiently to end users and systems.

Name two major providers of market data management systems in the financial industry.

Refinitiv (formerly Thomson Reuters, created when Thomson acquired Reuters) and Bloomberg are two of the largest providers of market data management systems.

What was RFDS and which company provided it?

RFDS was the Reuters market data system created when Reuters merged with TIBCO. It was a market data management platform from Reuters, which later became Thomson Reuters and is now part of Refinitiv.

Identify two product offerings from Refinitiv's portfolio of market data systems.

Refinitiv offers TREP (Thomson Reuters Enterprise Platform) and previously offered products such as Triarch and RFDS as iterations of their market data management systems.

What Bloomberg product line is used for data feed distribution?

B-PIPE is one of Bloomberg's product lines focused on data feeds and market data distribution.

What is a fundamental characteristic shared by all major enterprise data management providers?

All major providers fundamentally perform the same core function: they ingest market data, process it through feed handlers, and distribute it effectively and efficiently across financial systems.

3.7.1

Basics of back office data distribution systems

EXAM OBJECTIVES

  • What a reference data distribution system is
  • Understand how files are transferred via FTP, push/pull, secure FTP, automated FTP extraction delivery.

KEY CONCEPTS

What is a reference data distribution system?

A system designed to enable a firm to distribute reference data that has been ingested from external data feeds, consolidated data vendors, direct exchange feeds, or internally generated data to multiple users and applications across the organization.

Name three sources from which a reference data distribution system can receive data.

1) Consolidated data vendors, 2) Direct feeds from exchanges, 3) Internally generated data within the firm.

What is the primary function of a feed handler in a market data distribution system?

To connect with a data feed and handle that incoming data by interfacing with the data source and managing the flow of real-time updating data into the organization.

Identify two types of end users that receive distributed data from a reference data distribution system.

1) Human users sitting at workstations or terminals, 2) Applications, which can be either central applications providing specific functionality or local applications on personal computers.

What are four methods of file transfer in back office data distribution systems?

1) FTP (File Transfer Protocol), 2) Push mechanisms, 3) Pull mechanisms, 4) Secure FTP with automated extraction and delivery.

Explain the difference between push and pull file transfer methods.

Push: The source system initiates the transfer and sends files to the destination. Pull: The destination system initiates the request and retrieves files from the source system.
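
As an illustration of an automated "pull" transfer over secure FTP, here is a sketch using Python's standard ftplib; the host, credentials and file name are placeholders.

```python
from ftplib import FTP_TLS

# Placeholder host, credentials and file name for a nightly reference data pull.
HOST, USER, PASSWORD = "ftp.example.com", "refdata", "secret"

def pull_reference_file(remote_name="securities_eod.csv", local_name="securities_eod.csv"):
    """'Pull' transfer: the destination initiates the request over secure FTP."""
    with FTP_TLS(HOST) as ftp:
        ftp.login(USER, PASSWORD)
        ftp.prot_p()                                  # encrypt the data channel
        with open(local_name, "wb") as fh:
            ftp.retrbinary(f"RETR {remote_name}", fh.write)

# pull_reference_file()   # typically run from a scheduler for automated delivery
```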

3.7.2

Types of information files

KEY CONCEPTS

What is the primary characteristic of a data feed in the market data industry context?

A data feed is a large flow of data from a data vendor or exchange that provides structured, continuous streams of market information to an enterprise.

How does the technical definition of a data feed differ from its common usage in market data?

Technically, any flow of data over a network is a data feed. In common market data usage, a data feed specifically refers to large pipes of data from established data vendors or exchanges, not just any local network data transfer.

What role do data feeds play in enterprise data management?

Data feeds provide a structured and efficient mechanism for accessing and distributing large volumes of reference data and market information across an enterprise in an organized and highly accessible manner.

What sources typically provide data feeds in the financial market data industry?

Data feeds are typically provided by large data vendors or exchanges that supply continuous, large-scale flows of market and reference data to subscribing organizations.

3.8

THE CLOUD

EXAM OBJECTIVES

  • The basics of what 'the cloud' is
  • Differences between public, private and hybrid
  • Major western providers
  • APAC providers
  • Market data in the cloud

KEY CONCEPTS

What are the three main variations of cloud deployment models?

Public cloud, private cloud, and hybrid cloud

Define public cloud and provide an example of a major provider.

Public cloud is a shared service where any organization can subscribe. Multiple companies' data and applications run on the same physical infrastructure. Examples include AWS, Google Cloud, and Microsoft Azure.

What is a private cloud?

A private cloud is a remote data center with dedicated machines and software accessed off-premises, connected via the public internet or direct private leased lines, providing exclusive use rather than shared resources.

How does a private cloud differ from a public cloud in terms of resource allocation?

In public cloud, you share physical machines and infrastructure with other companies. In private cloud, you have dedicated infrastructure not shared with other organizations.

3.9

ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING

EXAM OBJECTIVES

  • What are the broad concepts?
  • How do cloud providers fit in this area?
  • Where would Natural Language Processing (NLP) play a role?

KEY CONCEPTS

What are the broad concepts of Artificial Intelligence and Machine Learning in cloud computing?

AI and ML are increasingly delivered as cloud-based services that enable automated decision-making and pattern recognition. Broad concepts include supervised learning, unsupervised learning, deep learning, and predictive analytics delivered as managed cloud services.

Why is avoiding vendor lock-in important when developing AI/ML applications on cloud platforms?

Vendor lock-in restricts flexibility to switch between cloud providers. Applications should be designed with portability in mind to prevent dependency on proprietary AI/ML services from a single vendor, maintaining multi-cloud strategy options.

How do major cloud providers support AI and Machine Learning capabilities?

Leading cloud providers (AWS, Azure, Google Cloud) offer managed AI/ML services including pre-built models, training platforms, data processing, and MLOps tools. Services are accessible via APIs and require subscription-based payment models.

Name three major cloud service providers offering AI/ML solutions and identify which are non-Western players.

Western providers: AWS, Microsoft Azure, Google Cloud. Non-Western providers: Alibaba Cloud and Tencent Cloud from China. These Chinese providers are significant in the Asia Pacific region.

What role does Natural Language Processing (NLP) play in cloud-based AI services?

NLP enables cloud services to process, analyze, and understand human language. Applications include chatbots, sentiment analysis, language translation, text classification, and voice recognition services accessible through cloud provider APIs.
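
To make the idea concrete, here is a tiny local sketch of one such task, sentiment classification of headlines, using scikit-learn; cloud providers expose equivalent functionality as managed services behind APIs, and the sample headlines and labels are invented.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Invented training data: headlines labelled by sentiment.
headlines = [
    "profits surge as demand recovers", "record earnings beat expectations",
    "shares slump after profit warning", "regulator fines bank over failures",
]
labels = ["positive", "positive", "negative", "negative"]

model = make_pipeline(CountVectorizer(), MultinomialNB())   # bag-of-words + Naive Bayes
model.fit(headlines, labels)
print(model.predict(["earnings beat forecasts", "bank issues profit warning"]))
```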

Which NLP use cases are commonly delivered as cloud services by providers?

Common cloud-delivered NLP use cases include conversational AI assistants, machine translation, entity recognition, sentiment analysis, automated content classification, and speech-to-text transcription services.
