Practise C.5 Analysing the web with authentic IB Computer Science (CS) exam questions for both SL and HL students. This question bank mirrors the Paper 1, 2 and 3 structure, covering key topics such as web graphs, search engine processes, connectivity, and the semantic web. Get instant solutions, detailed explanations, and build exam confidence with questions in the style of IB examiners.
When a user requests a file from a particular website, the website uses lossy compression to send the file to the user over the internet.
Discuss how this use of lossy compression might affect the user's experience.
Outline the function of each of the three processes used by a search engine: crawling, indexing, and searching.
Explain why the PageRank algorithm might discriminate against new websites.
Explain how a search engine is able to maintain an up-to-date index when the web is continually expanding.
A company uses graph theory to improve its web portal structure, ensuring better connectivity between sections.
Explain the term connectivity as it applies to web design.
Outline two ways graph theory insights can improve user experience on a website.
Describe one challenge of maintaining connectivity on a website that frequently updates.
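The connectivity check behind the portal scenario can be sketched with a breadth-first search over a hypothetical section graph. The section names and links are invented for illustration; the point is that a frequently updated site can silently orphan a section.

```python
# Can every section of the portal be reached from the home page?
from collections import deque

portal = {
    "home":     ["news", "shop", "help"],
    "news":     ["home"],
    "shop":     ["checkout"],
    "checkout": [],
    "help":     ["home"],
    "archive":  ["news"],      # after an update, nothing links here any more
}

def reachable(graph, start):
    """Breadth-first search: return every node reachable from start."""
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in graph[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# "archive" is orphaned: a typical connectivity hazard on a site
# that frequently updates its internal links.
print(sorted(set(portal) - reachable(portal, "home")))  # → ['archive']
```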
Search engines rely on web crawlers to systematically browse and index the web for relevant content.
Define a web crawler.
Explain how a web crawler uses the web graph to identify and index new pages.
Outline two limitations that web crawlers might face in attempting to index the entire web.
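A toy crawler makes the indexing and limitation questions concrete. The `web` dict is a stand-in for real HTTP fetches, and the `budget` parameter is an invented stand-in for real-world constraints (politeness delays, bandwidth, storage) that stop crawlers from indexing the entire web.

```python
# Toy breadth-first crawler over an in-memory "web".
from collections import deque

web = {                       # hypothetical pages and their outlinks
    "example.com":   ["example.com/a", "example.com/b"],
    "example.com/a": ["example.com/b", "example.com/c"],
    "example.com/b": [],
    "example.com/c": ["example.com/d"],
    "example.com/d": [],
}

def crawl(seed, budget):
    """Follow the web graph breadth-first, indexing at most `budget` pages."""
    index, frontier = set(), deque([seed])
    while frontier and len(index) < budget:
        url = frontier.popleft()
        if url in index:
            continue
        index.add(url)                      # "index" the fetched page
        frontier.extend(web.get(url, []))   # discover new pages via links
    return index

print(len(crawl("example.com", budget=3)))  # → 3: the budget leaves pages unindexed
```

With an unlimited budget the crawler finds all five pages, but only pages reachable by links from the seed can ever be discovered.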
SocialMediaMap is a platform that clusters web pages and content based on user interests and frequently updated social tags.
Define a sub-graph in the context of web analysis.
Explain two reasons why SocialMediaMap might use sub-graphs to organise content for users.
Describe one limitation of using sub-graphs to group related web pages.
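How a platform such as SocialMediaMap might carve a sub-graph out of the full web graph can be sketched as follows. The pages, tags, and links are invented for illustration: keep only the pages carrying a given tag, plus the links that stay inside that set (the induced sub-graph).

```python
# Extract the sub-graph induced by one social tag.
pages = {
    "p1": {"tags": {"music"},         "links": ["p2", "p4"]},
    "p2": {"tags": {"music", "film"}, "links": ["p3"]},
    "p3": {"tags": {"film"},          "links": ["p1"]},
    "p4": {"tags": {"music"},         "links": []},
}

def induced_subgraph(tag):
    keep = {p for p, d in pages.items() if tag in d["tags"]}
    return {p: [q for q in pages[p]["links"] if q in keep] for p in keep}

sub = induced_subgraph("music")
print(sorted(sub))   # → ['p1', 'p2', 'p4']
print(sub["p2"])     # → [] — the link to the "film" page p3 is lost
```

The last line shows one limitation of grouping by sub-graph: edges that cross the boundary (p2 to p3, p3 to p1) disappear, so related content outside the cluster becomes invisible.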
The structure of the web often follows a power law distribution, where a few popular sites receive the majority of links.
Define a power law distribution in the context of web structure.
Explain two reasons why a power law distribution might occur in web link structures.
Describe one potential drawback of a power law distribution on search engine ranking diversity.
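One commonly cited mechanism behind the power law, preferential attachment ("the rich get richer"), can be simulated in a few lines. This is a sketch under simplified assumptions, not a model of the real web: each new page links to an existing page with probability proportional to the links that page already has.

```python
# Preferential-attachment simulation: popular pages attract new links.
import random

random.seed(1)
indegree = {0: 1, 1: 1}             # two seed pages
targets = [0, 1]                    # one list entry per link endpoint

for new_page in range(2, 2000):
    chosen = random.choice(targets)  # proportional to current link count
    indegree[chosen] += 1
    targets.append(chosen)
    indegree[new_page] = 1           # the new page itself becomes a target
    targets.append(new_page)

top = sorted(indegree.values(), reverse=True)[:20]
share = sum(top) / sum(indegree.values())
# A small elite of pages holds a disproportionate share of all links,
# while most of the 2000 pages keep only one or two.
print(f"top-20 pages hold {share:.0%} of all link endpoints")
```

Because the choice is proportional to existing popularity, early and already-popular pages keep pulling ahead, which is exactly the ranking-diversity concern raised in the last question.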
The World Wide Web can be represented as a web graph.
The World Wide Web has expanded significantly over the last 10 years. In June 2018, there were an estimated 1600 million websites and 4100 million users. This is approximately twice the figures of January 2014.
The World Wide Web can be divided into the text web and the semantic web.
The development of technologies that underpin the World Wide Web has led to the growth of ambient intelligence and collective intelligence.
Identify two features of ambient intelligence.
Outline two reasons why there needs to be a balance between expressivity and usability on the semantic web.
Explain why a user can still navigate from one web page to another web page in the same amount of time as previously, even though the number of web pages is significantly larger.
Describe how a web graph can be used to represent the connectivity of the World Wide Web.
Evaluate the role of collective intelligence in the advancement of human knowledge.
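A rough back-of-envelope for the navigation-time question above, assuming (as small-world models do, an assumption not stated in the stimulus) that the average click distance between pages grows like log N for N pages with a fixed average number of outlinks k.

```python
# Small-world estimate: average clicks ≈ log(N) / log(k).
import math

def avg_clicks(n_pages, k=10):
    return math.log(n_pages) / math.log(k)

web_2014 = 8e8      # roughly half the 2018 figure given in the stimulus
web_2018 = 1.6e9    # "1600 million websites"

print(round(avg_clicks(web_2014), 2))  # → 8.9
print(round(avg_clicks(web_2018), 2))  # → 9.2
```

Doubling the number of websites adds only about a third of a click on average, which is why users can still navigate between pages in essentially the same time as before.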
A news aggregator uses the concept of web diameter to understand how far apart news sources are in terms of interlinking.
Define web diameter and explain its relevance in a web graph.
Describe one way the concept of web diameter could help the news aggregator refine its results.
Explain one limitation of using web diameter to analyse content on the web.
In a research study, a team of data scientists is using graph theory to analyse connectivity on the web.
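Web diameter, the longest shortest path (in clicks) between any pair of pages, can be computed on a small hypothetical graph of news sources by running a breadth-first search from every page. The source names and links are invented for illustration.

```python
# Diameter of a directed graph of news sources via all-pairs BFS.
from collections import deque

sources = {
    "bbc":     ["reuters"],
    "reuters": ["bbc", "local"],
    "local":   ["blog"],
    "blog":    ["bbc"],
}

def shortest(graph, start):
    """BFS: shortest click-distance from start to every reachable page."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph[node]:
            if nxt not in dist:
                dist[nxt] = dist[node] + 1
                queue.append(nxt)
    return dist

diameter = max(d for s in sources for d in shortest(sources, s).values())
print(diameter)  # → 3
```

Note the all-pairs BFS assumes every page can reach every other; on the real web, unreachable pairs make the diameter undefined or infinite, one of the limitations the last question asks about.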
Explain what is meant by connectivity in the context of the web graph.
Describe one example of how analysing connectivity could improve website navigation.
Outline two ways in which connectivity insights from graph theory could benefit search engine algorithms.
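One concrete connectivity insight: counting weakly connected components reveals clusters of pages with no links between them at all, which matters to a search engine because a crawler seeded in one component can never discover the others. The link data below is hypothetical.

```python
# Count weakly connected components of a directed link graph.
graph = {
    "a": ["b"], "b": ["c"], "c": ["a"],   # component 1
    "x": ["y"], "y": [],                  # component 2: unreachable from a/b/c
}

def weak_components(g):
    """Ignore link direction, then count connected components."""
    undirected = {n: set() for n in g}
    for n, outs in g.items():
        for m in outs:
            undirected[n].add(m)
            undirected[m].add(n)
    seen, components = set(), 0
    for n in undirected:
        if n in seen:
            continue
        components += 1
        stack = [n]
        while stack:
            cur = stack.pop()
            if cur in seen:
                continue
            seen.add(cur)
            stack.extend(undirected[cur] - seen)
    return components

print(weak_components(graph))  # → 2
```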
A web search engine uses graph theory to analyse its network of interconnected web pages.
Describe the term strongly connected core (SCC) in the context of a web graph.
Outline two benefits of identifying the SCC within a web graph for search engines.
Explain one way in which the concept of web diameter helps search engines refine search results.
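The SCC questions above can be made concrete with Kosaraju's algorithm, which finds the strongly connected components of a directed graph: the sets of pages that can all reach one another by following links. The four-page web graph is hypothetical.

```python
# Kosaraju's algorithm: find strongly connected components (SCCs).
def kosaraju(graph):
    order, seen = [], set()

    def dfs(g, node, out):
        seen.add(node)
        for nxt in g.get(node, []):
            if nxt not in seen:
                dfs(g, nxt, out)
        out.append(node)          # record node after exploring its successors

    for node in graph:            # first pass: compute finish order
        if node not in seen:
            dfs(graph, node, order)

    reverse = {n: [] for n in graph}   # second pass runs on reversed edges
    for n, outs in graph.items():
        for m in outs:
            reverse[m].append(n)

    seen.clear()
    sccs = []
    for node in reversed(order):  # peel off one SCC per DFS on the reverse graph
        if node not in seen:
            comp = []
            dfs(reverse, node, comp)
            sccs.append(sorted(comp))
    return sccs

web = {"a": ["b"], "b": ["c"], "c": ["a", "d"], "d": []}
print(max(kosaraju(web), key=len))  # → ['a', 'b', 'c'] — the strongly connected core
```

Pages a, b, and c form the core because each can reach the others; d can be reached from the core but cannot get back, so it sits outside it.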
The structure of the web can be represented using graph theory, where nodes represent web pages and edges represent hyperlinks between them.
Define a directed graph and describe its relevance to representing the structure of the web.
Explain one benefit of representing the web as a directed graph.
Outline one limitation of using a directed graph to analyse the web.
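A directed adjacency list makes the questions above concrete: it distinguishes "p links to q" from "q links to p", which an undirected graph cannot. The four pages below are hypothetical.

```python
# Represent the web as a directed graph: edges are one-way hyperlinks.
web = {
    "home":    ["about", "blog"],
    "about":   [],
    "blog":    ["home"],
    "archive": ["home"],   # links to home, but home does not link back
}

# In-degree and out-degree differ per page precisely because edges are directed.
out_links = {p: len(qs) for p, qs in web.items()}
in_links = {p: sum(p in qs for qs in web.values()) for p in web}

print(in_links["home"], out_links["home"])        # → 2 2
print(in_links["archive"], out_links["archive"])  # → 0 1
```

The benefit is visible ("archive" can reach "home" but not vice versa, which matters for crawling and ranking); a limitation is equally visible: the edges carry no information about link strength, anchor text, or how often a link is actually followed.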