What drives your most critical decisions? For the best data teams, the answer lies in pairing trusted data and metadata with human intuition and expertise, so that people can make informed decisions quickly and confidently, seizing new opportunities and mitigating risks.
As pioneers in the Active Metadata Management market, we are pleased to present our latest interview series, drawing on what we have learned from this fast-moving field. Celebrating the accomplishments of data practitioners is at the heart of Atlan's community identity, and these practitioners' experience, knowledge, and practical, data-driven insights help build a collective understanding of an ever-changing landscape.
In this installment of the series, we meet Prudhvi Vasa, Analytics Leader at Postman, who shares the history of Data & Analytics at Postman, how Atlan demystifies their modern data stack, and best practices for measuring and communicating the impact of data teams.
Would you mind introducing yourself, and telling us how you came to work in Data & Analytics?
I started my analytics journey right after graduation, joining Mu Sigma as a data analyst. At the time, it was the world's largest pure-play business analytics provider, and I gained invaluable experience there analyzing complex data sets and finding hidden patterns. I spent two years working with a leading US retailer, doing everything from routine reporting to building predictive models. Then I came back to India to study, earning my MBA from the Indian Institute of Management (IIM) Calcutta, and afterwards spent a year working for one of India's largest corporations.
As soon as I finished that one-year stint, I got an opportunity from an e-commerce firm. They said, "You have relevant experience. Why don't you come and lead Analytics?" My heart was always in data, so for the next five years I handled Data & Analytics for a company called MySmartPrice, a price comparison website.
After those five years, Postman happened. The founder, whom I knew from my college days, reached out and said, "We're growing rapidly, and we need to build a data team." I had never worked for a core technology company before, so the opportunity was too enticing to pass up. It sounded like a fascinating challenge, and that's how I joined the Postman team.
I had just joined when COVID-19 emerged, and together we navigated the shift to remote work and adapted to that unprecedented reality. Over the past three and a half years, the team has grown from an initial core of just 4 or 5 members to a diverse group of approximately 25 professionals.
We've also evolved significantly from a pure service model. The team is now integrated across the organization, backed by a top-notch data engineering squad that owns the entire data journey: ingestion, processing, transformation, and finally reverse ETL. Most of it is done in-house; we avoid unnecessary tooling dependencies in favor of a streamlined approach. Once the engineers serve up the data and tooling, the analysts take over.
Wherever people look for data, our team is committed to answering accurately, and answering once. We don't want to rehash the same question again, because we have already thought it through. That's our mantra: the company's scope vastly surpasses ours, but we can support its growth without having to grow in proportion.
I've been at this for nearly a dozen years, and every day still brings fresh enthusiasm to push things forward.
For readers who aren't familiar with it, could you describe Postman and what it does?
Postman is a leading B2B software-as-a-service (SaaS) company offering a complete API development platform. Software developers and their teams use it to design, collaborate on, test, and mock their Application Programming Interfaces (APIs), and to discover and share APIs. If you have ever built or tested an API, chances are you have used Postman. We've been around since 2012, starting as an angel-backed startup, and there's been no turning back since.
From its inception, the founders envisioned data playing a central role, and I'm delighted to say it has been crucial at every stage of the company's progression, giving us insight into our target audience, its size, and how many customers we could reach. That understanding let us maximize value for the company, and when we introduced new products, we used data to establish the right usage limits for each one. There isn't a single place where data doesn't hold sway.
Take billing, for example. Originally, if a customer failed to pay on time, we would wait a full year before marking the account inactive. When we analyzed the data, we found that almost no one who left came back after six months, so we shortened the window and gave ourselves a six-month cushion instead.
The same goes when we introduce a new pricing structure: we use data to understand how people respond, whether a change comforts or distresses them, and what its broader impact is.
Another significant piece is the analytics we've built around GitHub issues, which lets us identify trending requests and pinpoint where users run into difficulties. Product Managers receive a daily report showing where customers are struggling, guiding their decisions on what to build, fix, and respond to.
Wherever you can think of using data at Postman, chances are we've already explored it.
Impact is the first thing we consider with any request. When someone comes to us, we ask how they will use the data once we provide it, and often that question is met with silence. But it helps people answer the question honestly, and it helps us respond more effectively as well. Sometimes they realize that their question itself is misguided.
We strongly encourage people to think this through before they come to us. There's nothing more demoralizing for an analyst than delivering a model with no clear path forward because nobody first established the desired outcome; the effort ends up feeling futile.
What does Postman's data stack look like?
Our data architecture starts with ingestion, featuring our proprietary tool, Fulcrum, built on top of Amazon Web Services (AWS). We also use a tool called Hevo to bring in external data sources. We need data from LinkedIn, Twitter, Facebook, Salesforce, and Google, and we use Hevo because it's impractical to keep up with changing APIs for 50 individual tools ourselves.
We follow the ELT pattern, loading raw data into our centralized warehouse, Redshift. Once data lands, dbt serves as our transformation layer, letting us manipulate the data quickly and efficiently for analysis. Analysts author their transformation logic in dbt.
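To make the flow concrete, here is a minimal sketch of an ELT step like the one described: land a raw extract in S3, COPY it into Redshift untransformed, then let dbt handle transformation. The bucket, table, model name, role ARN, and credentials are placeholder assumptions, not Postman's actual Fulcrum internals.

```python
# A hypothetical ELT sketch; names and credentials are placeholders.
import subprocess
import boto3
import psycopg2

# 1. Extract/Load: land the raw export in S3.
boto3.client("s3").upload_file(
    "exports/events_2024_01_01.json",   # local raw extract
    "acme-raw-data",                    # hypothetical bucket
    "events/2024/01/01.json",
)

# 2. Load the raw file into Redshift without transforming it first.
conn = psycopg2.connect(host="warehouse.example.com", dbname="analytics",
                        user="loader", password="...")
with conn, conn.cursor() as cur:
    cur.execute("""
        COPY raw.events
        FROM 's3://acme-raw-data/events/2024/01/01.json'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-loader'
        FORMAT AS JSON 'auto';
    """)

# 3. Transform inside the warehouse by running the dbt models analysts wrote.
subprocess.run(["dbt", "run", "--select", "stg_events+"], check=True)
```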
On top of the transformations sits Looker, our business intelligence tool, where users build dashboards and ask questions. We also offer Redash alongside Looker as a querying tool for engineers and others who need ad-hoc analysis.
We've also built reverse ETL on our Fulcrum platform. We send data directly to platforms such as Salesforce and email marketing tools, and we send a significant volume of data back into the product itself, powering various recommendation engines as well as the in-product search.
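As an illustration of what such a reverse-ETL job might look like, here is a hedged sketch that reads a modeled table from Redshift and writes it to Salesforce with the simple_salesforce library. The marts.account_health table, the Health_Score__c custom field, and all credentials are hypothetical; Fulcrum itself is proprietary and in-house.

```python
# A hedged reverse-ETL sketch; table, field, and credentials are hypothetical.
import psycopg2
from simple_salesforce import Salesforce

sf = Salesforce(username="ops@example.com", password="...",
                security_token="...")

conn = psycopg2.connect(host="warehouse.example.com", dbname="analytics",
                        user="reverse_etl", password="...")
with conn, conn.cursor() as cur:
    # Read the modeled output the business wants to see inside Salesforce.
    cur.execute("SELECT account_sfdc_id, health_score FROM marts.account_health;")
    rows = cur.fetchall()

# Write each score back onto its Salesforce Account record.
for sfdc_id, score in rows:
    sf.Account.update(sfdc_id, {"Health_Score__c": score})
```

At real volumes you would push these writes through Salesforce's Bulk API rather than updating one record at a time, but the shape of the job is the same.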
Finally, we have Atlan for data cataloguing and tracking data lineage.
Could you tell us about Postman's journey with Atlan? What drove you to bring in a data catalog?
As Postman's popularity surged, the majority of questions we fielded were "Where can I find this data?" and "What does this data mean?", and answering them consumed an inordinate amount of our analysts' time. That is the reasoning behind Atlan's existence here. We started onboarding by standardizing and centralizing our glossary within Atlan, making it the one place you go to understand what your data means.
Subsequently, we used data lineage to identify the effects of pipeline issues on specific assets, enabling us to swiftly pinpoint and address any discrepancies. To further strengthen data security, we are also using lineage to find all personally identifiable information (PII) in our warehouse and determine whether we are masking it effectively.
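One common masking approach, sketched below as an assumption rather than Postman's actual method, is to grant analysts access only to a view that obscures the columns lineage has flagged as PII. The table, column, and group names are hypothetical.

```python
# A hedged PII-masking sketch; table, column, and group names are hypothetical.
import psycopg2

conn = psycopg2.connect(host="warehouse.example.com", dbname="analytics",
                        user="admin", password="...")
with conn, conn.cursor() as cur:
    cur.execute("""
        CREATE OR REPLACE VIEW analytics.users_masked AS
        SELECT
            user_id,
            -- Keep the email domain for analysis; hide the mailbox name.
            '***' || REGEXP_SUBSTR(email, '@.*$') AS email,
            signup_date
        FROM raw.users;
    """)
    # Analysts query the masked view; the raw table stays locked down.
    cur.execute("GRANT SELECT ON analytics.users_masked TO GROUP analysts;")
```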
Two personas stand out as key users of Atlan: Data Analysts, who use it to discover assets and keep definitions up to date; and Data Engineers, who rely on it to track lineage and safeguard sensitive PII. The software engineers who use Redash also stand to gain, and we're actively working to transition those users to Atlan.
Now that you've accomplished all this, what lies ahead? Is there anything you're excited to build in the next year?
I attended dbt's Coalesce conference a few months ago and came away enthusiastic. One crucial component for our team is DataOps, which gives us daily insight into how our data ingestion processes are performing.
When we assess our ingestion, we look for irregularities in the volume of data we're processing, the speed at which we can ingest it, and whether our transformations are running on the schedule we expect. We also want to identify and fix any broken content on our dashboards. So far we've built all of this in-house, but there's been a significant influx of innovative tools designed to manage it. I'm proud of what we built, but I'm also eager to explore new tools and techniques.
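To give a flavor of the kind of ingestion check involved, here is a toy sketch that compares each source table's daily row count against its trailing seven-day average and flags large deviations. The ops.ingestion_volume_daily table, the threshold, and the connection details are all assumptions.

```python
# A toy volume-anomaly check; the ops table and threshold are hypothetical.
import psycopg2

THRESHOLD = 0.5  # flag if today's volume deviates more than 50%

conn = psycopg2.connect(host="warehouse.example.com", dbname="analytics",
                        user="dataops", password="...")
with conn, conn.cursor() as cur:
    cur.execute("""
        SELECT table_name, rows_today, avg_rows_7d
        FROM ops.ingestion_volume_daily
        WHERE load_date = CURRENT_DATE;
    """)
    for table_name, rows_today, avg_rows_7d in cur.fetchall():
        if avg_rows_7d and abs(rows_today - avg_rows_7d) / avg_rows_7d > THRESHOLD:
            print(f"ALERT: {table_name} loaded {rows_today} rows "
                  f"(7-day average {avg_rows_7d:.0f})")
```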
As part of our performance work, we've introduced a caching layer to address Looker's sluggish dashboard loading, aiming to significantly reduce load times and give users a seamless experience. The caching layer loads many dashboards upfront, so they're readily available whenever a user visits. I'm genuinely enthusiastic about driving dashboard load times down week after week and month after month.
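A pre-warming loop is one simple way such a layer can work. The sketch below is an assumption, not Postman's actual implementation: it requests the most-visited dashboards off-hours so their queries are cached before users arrive. The internal endpoint and dashboard IDs are made up.

```python
# A hypothetical dashboard pre-warming loop; endpoint and IDs are made up.
import time
import requests

POPULAR_DASHBOARDS = ["revenue-overview", "product-usage", "support-queue"]

for dashboard_id in POPULAR_DASHBOARDS:
    start = time.monotonic()
    # Requesting the dashboard forces its queries to run and populate the
    # cache, so the next human visitor gets a cached result.
    resp = requests.get(
        f"https://bi-cache.internal.example.com/dashboards/{dashboard_id}")
    elapsed = time.monotonic() - start
    print(f"{dashboard_id}: warmed in {elapsed:.1f}s (status {resp.status_code})")
```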
Several large language models have emerged recently. In my opinion, the biggest limitation in data is discovering what exists. Many people are trying to make sense of it, not just at the surface level, but looking for a deeper understanding. At some point, I'd love to have a chatbot that can answer questions from across the company, questions like "Why is this number dwindling?" or "How do I address this issue?" We're trying out a couple of new tools and building something in-house as well.
It's still in its infancy, and we don't know yet whether it will pay off. But to elevate the experience, we have to use automation to take the load off the data team. A human can't always be there to answer, but training a bot to do so could be a valuable asset.
It seems your team understands its impact quite well. Do you have any advice for your peers on how to measure and communicate the impact of their data teams?
That's a really powerful question. I'll divide this into two parts: Data Engineering and Analytics.
Data Engineering success is more easily measurable. I focus on four metrics: quality, availability, process efficiency, and performance.
Quality metrics assess the accuracy of your data and whether established processes are being followed. If you use Jira, you're likely tracking bugs and incidents and monitoring how quickly they're resolved. Over time, establish a reliable metric and watch whether your scores improve.
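For instance, here is a hedged sketch of one such metric, assuming a hypothetical DATA project in Jira: pull recently resolved data bugs and compute the mean time to resolution with the jira client library.

```python
# A hedged quality-metric sketch; project key, JQL, and credentials are hypothetical.
from datetime import datetime
from jira import JIRA

def parse_ts(s: str) -> datetime:
    # Jira timestamps look like "2024-01-31T12:34:56.000+0000".
    return datetime.strptime(s[:19], "%Y-%m-%dT%H:%M:%S")

jira = JIRA(server="https://example.atlassian.net",
            basic_auth=("metrics-bot@example.com", "<api-token>"))

issues = jira.search_issues(
    "project = DATA AND issuetype = Bug AND resolved >= -90d", maxResults=500)

hours = [
    (parse_ts(i.fields.resolutiondate) - parse_ts(i.fields.created)).total_seconds() / 3600
    for i in issues if i.fields.resolutiondate
]
print(f"{len(hours)} bugs resolved; mean time to resolution: "
      f"{sum(hours) / len(hours):.1f}h")
```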
Availability is similar. When people request a dashboard or ask a question, is the data they need readily available? If not, quantify it and track your efforts so you can show consistent improvement.
Process efficiency is how promptly we respond when someone comes to us with a query, which in turn streamlines their decision-making. It's a very potent metric because the recommendations it produces are straightforward. When answers arrive late, people remember the data team as the one that failed to deliver, so measuring and improving response time matters.
Last is performance. Your dashboard may be wonderful, but that's irrelevant if it doesn't load when someone needs it. If a person opens a dashboard and it fails to load, their frustration grows, and even the most exceptional work becomes irrelevant. So for me, performance means how quickly a dashboard loads. I might measure load time against a target of, say, 10 seconds, and check which elements load within that window.
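To make that target concrete, here is a toy sketch that scores a set of dashboard load timings against a 10-second budget. In practice the timings would come from your BI tool's usage logs; here they are synthetic stand-ins.

```python
# A toy load-time budget report; the timings are synthetic stand-ins.
import random
import statistics

TARGET_SECONDS = 10.0

# Stand-in for real load timings pulled from BI usage logs.
load_times = [random.uniform(2, 18) for _ in range(200)]

within_budget = sum(t <= TARGET_SECONDS for t in load_times) / len(load_times)
print(f"p50 load time: {statistics.median(load_times):.1f}s")
print(f"p95 load time: {sorted(load_times)[int(0.95 * len(load_times))]:.1f}s")
print(f"{within_budget:.0%} of loads met the {TARGET_SECONDS:.0f}s target")
```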
On the Analytics side, one effective way to gauge satisfaction is to run an NPS-style survey and assess whether people feel at ease working with your team. The other approach demands a high level of process orientation, which means using a ticketing system.
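The NPS calculation itself is simple: the share of promoters (scores of 9 or 10) minus the share of detractors (0 through 6). A small sketch with synthetic survey scores:

```python
# Standard NPS formula applied to synthetic survey responses.
def nps(scores: list[int]) -> float:
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return 100 * (promoters - detractors) / len(scores)

survey_responses = [10, 9, 8, 7, 9, 6, 10, 4, 9, 8]
print(f"Team NPS: {nps(survey_responses):+.0f}")  # prints "Team NPS: +30"
```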
As a regular practice, we review our resolved analytics tickets at the end of each quarter and assess the impact each one had: how many product changes resulted from our analysis, and how many business decisions were informed by our data?
In terms of insight generation, our influence in a quarter might be two sales decisions, two business operations decisions, and three product decisions. How else will you gauge that? It's crucial to measure it.
When you join a company with little to no experience with data teams, a typical outcome is that the team conducts numerous analyses, often more than 10 studies, of which only one or two ultimately impact the business; most hypotheses are invalidated rather than confirmed. Documenting your process lets you accurately convey what happened last quarter, ensuring transparency and accountability. You want to be able to say "We tested ten hypotheses, and one was validated" rather than the vaguer "I feel we had one idea that worked."
Measure your progress and document everything accurately. At the very least, you and your team can take pride in your accomplishments, and you can also acknowledge each individual's role and contributions.